Apr 16 18:07:08.694122 ip-10-0-142-43 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 18:07:08.694134 ip-10-0-142-43 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 18:07:08.694142 ip-10-0-142-43 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 18:07:08.694374 ip-10-0-142-43 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 18:07:18.730853 ip-10-0-142-43 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 18:07:18.730865 ip-10-0-142-43 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot f765436f54fd41b78e6bada58c856802 --
Apr 16 18:09:40.220238 ip-10-0-142-43 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 18:09:40.691161 ip-10-0-142-43 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:09:40.691161 ip-10-0-142-43 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 18:09:40.691161 ip-10-0-142-43 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:09:40.691161 ip-10-0-142-43 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 18:09:40.691161 ip-10-0-142-43 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:09:40.692679 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.692580 2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 18:09:40.696122 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696099 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:09:40.696122 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696121 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:09:40.696122 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696125 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:09:40.696230 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696129 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:09:40.696230 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696133 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:09:40.696230 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696136 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:09:40.696230 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696140 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:09:40.696230 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696144 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:09:40.696230 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696146 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:09:40.696230 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696149 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:09:40.696230 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696152 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:09:40.696230 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696155 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:09:40.696230 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696157 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:09:40.696230 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696160 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:09:40.696230 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696163 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:09:40.696230 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696165 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:09:40.696230 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696168 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:09:40.696230 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696170 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:09:40.696230 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696173 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:09:40.696230 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696178 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:09:40.696230 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696181 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:09:40.696230 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696184 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:09:40.696230 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696186 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:09:40.696732 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696189 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:09:40.696732 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696191 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:09:40.696732 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696194 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:09:40.696732 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696197 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:09:40.696732 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696199 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:09:40.696732 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696202 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:09:40.696732 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696205 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:09:40.696732 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696208 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:09:40.696732 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696210 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:09:40.696732 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696213 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:09:40.696732 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696216 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:09:40.696732 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696219 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:09:40.696732 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696222 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:09:40.696732 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696225 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:09:40.696732 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696227 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:09:40.696732 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696230 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:09:40.696732 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696232 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:09:40.696732 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696235 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:09:40.696732 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696237 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:09:40.696732 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696240 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:09:40.697307 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696243 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:09:40.697307 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696246 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:09:40.697307 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696248 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:09:40.697307 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696251 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:09:40.697307 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696254 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:09:40.697307 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696256 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:09:40.697307 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696259 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:09:40.697307 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696262 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:09:40.697307 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696265 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:09:40.697307 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696267 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:09:40.697307 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696270 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:09:40.697307 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696272 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:09:40.697307 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696275 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:09:40.697307 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696279 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:09:40.697307 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696281 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:09:40.697307 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696284 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:09:40.697307 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696287 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:09:40.697307 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696289 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:09:40.697307 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696292 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:09:40.697307 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696295 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:09:40.697809 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696302 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:09:40.697809 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696307 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:09:40.697809 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696319 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:09:40.697809 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696322 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:09:40.697809 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696325 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:09:40.697809 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696330 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:09:40.697809 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696333 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:09:40.697809 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696337 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:09:40.697809 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696340 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:09:40.697809 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696344 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:09:40.697809 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696348 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:09:40.697809 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696352 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:09:40.697809 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696355 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:09:40.697809 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696358 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:09:40.697809 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696360 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:09:40.697809 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696363 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:09:40.697809 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696366 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:09:40.697809 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696369 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:09:40.698296 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696371 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:09:40.698296 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696374 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:09:40.698296 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696377 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:09:40.698296 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696379 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:09:40.698296 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.696382 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:09:40.698296 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697532 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:09:40.698296 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697539 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:09:40.698296 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697543 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:09:40.698296 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697546 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:09:40.698296 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697549 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:09:40.698296 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697552 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:09:40.698296 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697555 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:09:40.698296 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697559 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:09:40.698296 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697561 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:09:40.698296 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697564 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:09:40.698296 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697567 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:09:40.698296 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697569 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:09:40.698296 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697573 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:09:40.698296 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697576 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:09:40.698296 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697579 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:09:40.698804 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697582 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:09:40.698804 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697584 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:09:40.698804 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697587 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:09:40.698804 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697590 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:09:40.698804 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697592 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:09:40.698804 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697595 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:09:40.698804 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697598 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:09:40.698804 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697600 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:09:40.698804 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697603 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:09:40.698804 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697606 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:09:40.698804 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697608 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:09:40.698804 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697611 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:09:40.698804 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697614 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:09:40.698804 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697616 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:09:40.698804 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697619 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:09:40.698804 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697621 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:09:40.698804 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697624 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:09:40.698804 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697627 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:09:40.698804 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697629 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:09:40.699322 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697632 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:09:40.699322 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697634 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:09:40.699322 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697637 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:09:40.699322 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697639 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:09:40.699322 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697642 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:09:40.699322 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697645 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:09:40.699322 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697649 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:09:40.699322 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697652 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:09:40.699322 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697655 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:09:40.699322 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697657 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:09:40.699322 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697660 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:09:40.699322 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697663 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:09:40.699322 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697666 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:09:40.699322 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697669 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:09:40.699322 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697672 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:09:40.699322 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697674 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:09:40.699322 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697678 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:09:40.699322 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697680 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:09:40.699322 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697683 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:09:40.699322 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697686 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:09:40.699866 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697688 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:09:40.699866 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697691 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:09:40.699866 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697694 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:09:40.699866 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697697 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:09:40.699866 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697701 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:09:40.699866 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697705 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:09:40.699866 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697708 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:09:40.699866 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697711 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:09:40.699866 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697714 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:09:40.699866 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697716 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:09:40.699866 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697720 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:09:40.699866 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697723 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:09:40.699866 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697725 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:09:40.699866 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697728 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:09:40.699866 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697730 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:09:40.699866 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697733 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:09:40.699866 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697736 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:09:40.699866 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697738 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:09:40.699866 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697741 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:09:40.700358 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697743 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:09:40.700358 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697746 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:09:40.700358 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697748 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:09:40.700358 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697751 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:09:40.700358 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697753 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:09:40.700358 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697756 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:09:40.700358 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697759 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:09:40.700358 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697762 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:09:40.700358 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697765 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:09:40.700358 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697767 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:09:40.700358 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697770 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:09:40.700358 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697773 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:09:40.700358 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.697775 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:09:40.700358 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.697877 2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 18:09:40.700358 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.697905 2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 18:09:40.700358 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.697914 2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 18:09:40.700358 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.697918 2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 18:09:40.700358 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.697923 2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 18:09:40.700358 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.697927 2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 18:09:40.700358 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.697932 2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 18:09:40.700358 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.697937 2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 18:09:40.700358 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.697941 2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 18:09:40.700931 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.697944 2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 18:09:40.700931 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.697948 2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 18:09:40.700931 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.697951 2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 18:09:40.700931 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.697955 2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 18:09:40.700931 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.697958 2574 flags.go:64] FLAG: --cgroup-root=""
Apr 16 18:09:40.700931 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.697961 2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 18:09:40.700931 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.697964 2574 flags.go:64] FLAG: --client-ca-file=""
Apr 16 18:09:40.700931 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.697967 2574 flags.go:64] FLAG: --cloud-config=""
Apr 16 18:09:40.700931 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.697970 2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 18:09:40.700931 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.697974 2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 18:09:40.700931 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.697979 2574 flags.go:64] FLAG: --cluster-domain=""
Apr 16 18:09:40.700931 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.697982 2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 18:09:40.700931 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.697986 2574 flags.go:64] FLAG: --config-dir=""
Apr 16 18:09:40.700931 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.697989 2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 18:09:40.700931 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.697993 2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 18:09:40.700931 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.697999 2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 18:09:40.700931 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698002 2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 18:09:40.700931 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698006 2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 18:09:40.700931 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698010 2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 18:09:40.700931 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698013 2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 18:09:40.700931 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698016 2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 18:09:40.700931 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698019 2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 18:09:40.700931 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698022 2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 18:09:40.700931 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698025 2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 18:09:40.700931 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698030 2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 18:09:40.701592 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698033 2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 18:09:40.701592 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698036 2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 18:09:40.701592 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698039 2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 18:09:40.701592 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698043 2574 flags.go:64] FLAG: --enable-server="true"
Apr 16 18:09:40.701592 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698045 2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 18:09:40.701592 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698053 2574 flags.go:64] FLAG: --event-burst="100"
Apr 16 18:09:40.701592 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698056 2574 flags.go:64] FLAG: --event-qps="50"
Apr 16 18:09:40.701592 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698059 2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 18:09:40.701592 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698063 2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 18:09:40.701592 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698066 2574 flags.go:64] FLAG: --eviction-hard=""
Apr 16 18:09:40.701592 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698070 2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 18:09:40.701592 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698073 2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 18:09:40.701592 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698076 2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 18:09:40.701592 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698079 2574 flags.go:64] FLAG: --eviction-soft=""
Apr 16 18:09:40.701592 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698082 2574 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 18:09:40.701592 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698085 2574 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 18:09:40.701592 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698089 2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 18:09:40.701592 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698092 2574 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 18:09:40.701592 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698095 2574 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 18:09:40.701592 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698097 2574 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 18:09:40.701592 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698100 2574 flags.go:64] FLAG: --feature-gates=""
Apr 16 18:09:40.701592 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698105 2574 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 18:09:40.701592 ip-10-0-142-43 kubenswrapper[2574]: 
I0416 18:09:40.698108 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 18:09:40.701592 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698111 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 18:09:40.701592 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698114 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 18:09:40.702247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698117 2574 flags.go:64] FLAG: --healthz-port="10248" Apr 16 18:09:40.702247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698121 2574 flags.go:64] FLAG: --help="false" Apr 16 18:09:40.702247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698123 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-142-43.ec2.internal" Apr 16 18:09:40.702247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698126 2574 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 18:09:40.702247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698130 2574 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 18:09:40.702247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698133 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 18:09:40.702247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698137 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 18:09:40.702247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698140 2574 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 18:09:40.702247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698151 2574 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 18:09:40.702247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698155 2574 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 18:09:40.702247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698158 2574 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 
18:09:40.702247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698161 2574 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 18:09:40.702247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698164 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 18:09:40.702247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698167 2574 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 18:09:40.702247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698170 2574 flags.go:64] FLAG: --kube-reserved="" Apr 16 18:09:40.702247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698174 2574 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 18:09:40.702247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698177 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 18:09:40.702247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698180 2574 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 18:09:40.702247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698183 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 18:09:40.702247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698186 2574 flags.go:64] FLAG: --lock-file="" Apr 16 18:09:40.702247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698189 2574 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 18:09:40.702247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698193 2574 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 18:09:40.702247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698196 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 18:09:40.702247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698202 2574 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 18:09:40.702914 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698206 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 18:09:40.702914 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698209 2574 flags.go:64] FLAG: 
--log-text-split-stream="false" Apr 16 18:09:40.702914 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698212 2574 flags.go:64] FLAG: --logging-format="text" Apr 16 18:09:40.702914 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698216 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 18:09:40.702914 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698219 2574 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 18:09:40.702914 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698222 2574 flags.go:64] FLAG: --manifest-url="" Apr 16 18:09:40.702914 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698225 2574 flags.go:64] FLAG: --manifest-url-header="" Apr 16 18:09:40.702914 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698230 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 18:09:40.702914 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698233 2574 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 18:09:40.702914 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698238 2574 flags.go:64] FLAG: --max-pods="110" Apr 16 18:09:40.702914 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698241 2574 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 18:09:40.702914 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698244 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 18:09:40.702914 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698247 2574 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 18:09:40.702914 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698250 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 18:09:40.702914 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698253 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 18:09:40.702914 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698257 2574 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 18:09:40.702914 ip-10-0-142-43 
kubenswrapper[2574]: I0416 18:09:40.698261 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 18:09:40.702914 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698269 2574 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 18:09:40.702914 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698272 2574 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 18:09:40.702914 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698275 2574 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 18:09:40.702914 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698279 2574 flags.go:64] FLAG: --pod-cidr="" Apr 16 18:09:40.702914 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698282 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 18:09:40.702914 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698288 2574 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 18:09:40.703489 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698291 2574 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 18:09:40.703489 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698294 2574 flags.go:64] FLAG: --pods-per-core="0" Apr 16 18:09:40.703489 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698297 2574 flags.go:64] FLAG: --port="10250" Apr 16 18:09:40.703489 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698301 2574 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 18:09:40.703489 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698304 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-014c0998f3be0a816" Apr 16 18:09:40.703489 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698308 2574 flags.go:64] FLAG: --qos-reserved="" Apr 16 18:09:40.703489 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698311 2574 flags.go:64] FLAG: --read-only-port="10255" Apr 16 18:09:40.703489 
ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698314 2574 flags.go:64] FLAG: --register-node="true" Apr 16 18:09:40.703489 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698317 2574 flags.go:64] FLAG: --register-schedulable="true" Apr 16 18:09:40.703489 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698320 2574 flags.go:64] FLAG: --register-with-taints="" Apr 16 18:09:40.703489 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698324 2574 flags.go:64] FLAG: --registry-burst="10" Apr 16 18:09:40.703489 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698327 2574 flags.go:64] FLAG: --registry-qps="5" Apr 16 18:09:40.703489 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698330 2574 flags.go:64] FLAG: --reserved-cpus="" Apr 16 18:09:40.703489 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698333 2574 flags.go:64] FLAG: --reserved-memory="" Apr 16 18:09:40.703489 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698337 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 18:09:40.703489 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698340 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 18:09:40.703489 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698343 2574 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 18:09:40.703489 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698346 2574 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 18:09:40.703489 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698349 2574 flags.go:64] FLAG: --runonce="false" Apr 16 18:09:40.703489 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698352 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 18:09:40.703489 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698355 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 18:09:40.703489 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698358 2574 flags.go:64] FLAG: --seccomp-default="false" Apr 16 18:09:40.703489 
ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698361 2574 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 18:09:40.703489 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698367 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 18:09:40.703489 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698370 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 18:09:40.703489 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698373 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 18:09:40.704200 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698377 2574 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 18:09:40.704200 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698380 2574 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 18:09:40.704200 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698383 2574 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 18:09:40.704200 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698386 2574 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 18:09:40.704200 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698389 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 18:09:40.704200 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698392 2574 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 18:09:40.704200 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698395 2574 flags.go:64] FLAG: --system-cgroups="" Apr 16 18:09:40.704200 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698398 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 18:09:40.704200 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698404 2574 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 18:09:40.704200 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698407 2574 flags.go:64] FLAG: --tls-cert-file="" Apr 16 18:09:40.704200 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698410 2574 
flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 18:09:40.704200 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698414 2574 flags.go:64] FLAG: --tls-min-version="" Apr 16 18:09:40.704200 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698417 2574 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 18:09:40.704200 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698420 2574 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 18:09:40.704200 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698423 2574 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 18:09:40.704200 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698426 2574 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 18:09:40.704200 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698429 2574 flags.go:64] FLAG: --v="2" Apr 16 18:09:40.704200 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698433 2574 flags.go:64] FLAG: --version="false" Apr 16 18:09:40.704200 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698437 2574 flags.go:64] FLAG: --vmodule="" Apr 16 18:09:40.704200 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698442 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 18:09:40.704200 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.698446 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 18:09:40.704200 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698550 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:09:40.704200 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698554 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:09:40.704200 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698557 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:09:40.704794 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698560 2574 feature_gate.go:328] unrecognized feature gate: 
ManagedBootImagesvSphere Apr 16 18:09:40.704794 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698562 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:09:40.704794 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698565 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:09:40.704794 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698568 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:09:40.704794 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698570 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:09:40.704794 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698575 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:09:40.704794 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698577 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:09:40.704794 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698580 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:09:40.704794 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698582 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:09:40.704794 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698587 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 18:09:40.704794 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698591 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:09:40.704794 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698594 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:09:40.704794 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698596 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:09:40.704794 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698602 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:09:40.704794 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698605 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:09:40.704794 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698608 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:09:40.704794 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698611 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:09:40.704794 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698613 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:09:40.704794 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698616 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:09:40.705336 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698619 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:09:40.705336 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698622 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:09:40.705336 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698625 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:09:40.705336 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698627 2574 feature_gate.go:328] unrecognized feature 
gate: IngressControllerLBSubnetsAWS Apr 16 18:09:40.705336 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698630 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:09:40.705336 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698632 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:09:40.705336 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698635 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:09:40.705336 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698637 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:09:40.705336 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698640 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:09:40.705336 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698643 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:09:40.705336 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698646 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:09:40.705336 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698649 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:09:40.705336 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698651 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:09:40.705336 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698654 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:09:40.705336 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698656 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:09:40.705336 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698659 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:09:40.705336 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698661 2574 feature_gate.go:328] unrecognized 
feature gate: BootcNodeManagement Apr 16 18:09:40.705336 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698664 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:09:40.705336 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698669 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:09:40.705336 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698671 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:09:40.705869 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698674 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:09:40.705869 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698676 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:09:40.705869 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698679 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:09:40.705869 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698682 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:09:40.705869 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698685 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:09:40.705869 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698689 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 18:09:40.705869 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698694 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:09:40.705869 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698698 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:09:40.705869 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698701 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:09:40.705869 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698704 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:09:40.705869 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698707 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:09:40.705869 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698709 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:09:40.705869 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698712 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:09:40.705869 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698715 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:09:40.705869 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698718 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:09:40.705869 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698720 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:09:40.705869 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698723 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:09:40.705869 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698725 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:09:40.705869 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698728 2574 feature_gate.go:328] 
unrecognized feature gate: RouteAdvertisements Apr 16 18:09:40.706369 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698731 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:09:40.706369 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698733 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:09:40.706369 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698736 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:09:40.706369 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698739 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:09:40.706369 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698741 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:09:40.706369 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698744 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:09:40.706369 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698746 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:09:40.706369 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698749 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:09:40.706369 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698751 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:09:40.706369 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698754 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:09:40.706369 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698756 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:09:40.706369 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698761 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:09:40.706369 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698764 2574 
feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:09:40.706369 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698766 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:09:40.706369 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698769 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:09:40.706369 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698771 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:09:40.706369 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698774 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:09:40.706369 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698778 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:09:40.706369 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698780 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:09:40.706369 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698784 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:09:40.706939 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698787 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:09:40.706939 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698789 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:09:40.706939 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698792 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:09:40.706939 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698794 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:09:40.706939 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.698798 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:09:40.706939 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.699410 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:09:40.707241 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.707215 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 18:09:40.707278 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.707243 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 18:09:40.707309 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707299 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:09:40.707309 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707305 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:09:40.707309 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707309 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:09:40.707393 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707312 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:09:40.707393 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707316 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:09:40.707393 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707319 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:09:40.707393 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707322 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:09:40.707393 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707325 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:09:40.707393 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707327 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:09:40.707393 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707330 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:09:40.707393 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707333 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:09:40.707393 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707336 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:09:40.707393 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707338 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:09:40.707393 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707341 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:09:40.707393 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707343 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:09:40.707393 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707347 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:09:40.707393 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707350 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:09:40.707393 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707352 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:09:40.707393 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707355 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:09:40.707393 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707357 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:09:40.707393 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707360 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:09:40.707393 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707362 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:09:40.707393 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707365 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:09:40.707973 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707368 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:09:40.707973 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707371 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:09:40.707973 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707373 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:09:40.707973 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707376 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:09:40.707973 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707379 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:09:40.707973 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707383 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:09:40.707973 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707388 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:09:40.707973 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707392 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:09:40.707973 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707396 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:09:40.707973 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707399 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:09:40.707973 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707402 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:09:40.707973 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707406 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:09:40.707973 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707411 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:09:40.707973 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707414 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:09:40.707973 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707417 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:09:40.707973 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707420 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:09:40.707973 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707423 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:09:40.707973 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707426 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:09:40.707973 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707428 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:09:40.708457 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707431 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:09:40.708457 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707434 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:09:40.708457 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707436 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:09:40.708457 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707439 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:09:40.708457 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707442 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:09:40.708457 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707444 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:09:40.708457 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707447 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:09:40.708457 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707450 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:09:40.708457 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707452 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:09:40.708457 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707455 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:09:40.708457 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707458 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:09:40.708457 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707461 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:09:40.708457 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707463 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:09:40.708457 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707466 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:09:40.708457 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707469 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:09:40.708457 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707472 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:09:40.708457 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707474 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:09:40.708457 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707477 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:09:40.708457 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707480 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:09:40.708457 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707483 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:09:40.709088 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707486 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:09:40.709088 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707489 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:09:40.709088 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707492 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:09:40.709088 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707495 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:09:40.709088 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707497 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:09:40.709088 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707500 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:09:40.709088 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707503 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:09:40.709088 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707505 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:09:40.709088 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707508 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:09:40.709088 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707511 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:09:40.709088 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707514 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:09:40.709088 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707517 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:09:40.709088 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707520 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:09:40.709088 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707522 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:09:40.709088 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707525 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:09:40.709088 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707528 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:09:40.709088 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707530 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:09:40.709088 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707533 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:09:40.709088 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707535 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:09:40.709088 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707538 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:09:40.709589 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707541 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:09:40.709589 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707543 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:09:40.709589 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707546 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:09:40.709589 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707549 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:09:40.709589 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.707554 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:09:40.709589 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707663 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:09:40.709589 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707669 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:09:40.709589 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707672 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:09:40.709589 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707676 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
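The `feature_gate.go:384` entries above dump the effective gate set as a Go map literal. When triaging a log like this one, it can help to turn that line into a structured mapping; below is a minimal sketch (Python, with a hypothetical `parse_feature_gates` helper; not part of the kubelet) that parses the `{map[Key:bool ...]}` format:

```python
import re

def parse_feature_gates(line: str) -> dict[str, bool]:
    """Parse a kubelet 'feature gates: {map[...]}' log line into a dict.

    Matches the Go map literal emitted by feature_gate.go:384, e.g.
    '... feature gates: {map[ImageVolume:true KMSv1:true ...]}'.
    """
    m = re.search(r"feature gates: \{map\[(.*?)\]\}", line)
    if not m:
        return {}
    pairs = m.group(1).split()
    return {k: v == "true" for k, v in (p.split(":") for p in pairs)}

# Abbreviated example line in the same format as the log above:
line = ('I0416 18:09:40.707554 2574 feature_gate.go:384] feature gates: '
        '{map[DynamicResourceAllocation:false ImageVolume:true KMSv1:true]}')
print(parse_feature_gates(line))  # {'DynamicResourceAllocation': False, 'ImageVolume': True, 'KMSv1': True}
```

This keeps only the key/boolean pairs; entries outside the `{map[...]}` span (timestamps, PID, source location) are ignored.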
Apr 16 18:09:40.709589 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707681 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:09:40.709589 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707685 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:09:40.709589 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707687 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:09:40.709589 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707690 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:09:40.709589 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707693 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:09:40.709589 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707696 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:09:40.709589 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707699 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:09:40.710032 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707701 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:09:40.710032 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707704 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:09:40.710032 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707707 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:09:40.710032 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707709 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:09:40.710032 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707712 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:09:40.710032 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707715 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:09:40.710032 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707717 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:09:40.710032 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707719 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:09:40.710032 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707722 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:09:40.710032 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707724 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:09:40.710032 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707727 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:09:40.710032 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707730 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:09:40.710032 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707732 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:09:40.710032 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707735 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:09:40.710032 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707737 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:09:40.710032 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707740 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:09:40.710032 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707743 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:09:40.710032 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707745 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:09:40.710032 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707748 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:09:40.710514 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707750 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:09:40.710514 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707753 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:09:40.710514 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707756 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:09:40.710514 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707758 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:09:40.710514 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707761 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:09:40.710514 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707764 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:09:40.710514 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707766 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:09:40.710514 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707770 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:09:40.710514 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707772 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:09:40.710514 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707775 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:09:40.710514 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707778 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:09:40.710514 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707781 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:09:40.710514 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707783 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:09:40.710514 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707787 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:09:40.710514 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707789 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:09:40.710514 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707792 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:09:40.710514 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707794 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:09:40.710514 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707797 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:09:40.710514 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707799 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:09:40.710514 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707804 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
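The same `unrecognized feature gate` warnings recur throughout this capture because the gate set is parsed several times during startup (note the repeated `feature_gate.go:384` dumps). When triaging, a quick deduplication pass makes the distinct gate names easier to see; a minimal sketch (Python, hypothetical `gate_warning_counts` helper, not part of any tooling shown in the log):

```python
import re
from collections import Counter

def gate_warning_counts(log_text: str) -> Counter:
    """Count how often each 'unrecognized feature gate' warning appears."""
    return Counter(re.findall(r"unrecognized feature gate: (\S+)", log_text))

# Abbreviated sample in the same format as the journal entries above:
log = """\
W0416 18:09:40.707690 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
W0416 18:09:40.707693 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
W0416 18:09:40.707442 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
"""
print(gate_warning_counts(log).most_common(1))  # [('ExternalOIDC', 2)]
```

A count much greater than one per gate simply reflects repeated parsing passes, not repeated misconfiguration.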
Apr 16 18:09:40.711050 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707807 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:09:40.711050 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707810 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:09:40.711050 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707813 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:09:40.711050 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707816 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:09:40.711050 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707818 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:09:40.711050 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707821 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:09:40.711050 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707839 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:09:40.711050 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707842 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:09:40.711050 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707845 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:09:40.711050 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707847 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:09:40.711050 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707850 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:09:40.711050 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707853 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:09:40.711050 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707855 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:09:40.711050 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707858 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:09:40.711050 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707860 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:09:40.711050 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707863 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:09:40.711050 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707865 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:09:40.711050 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707869 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:09:40.711050 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707872 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:09:40.711050 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707875 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:09:40.711602 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707877 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:09:40.711602 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707880 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:09:40.711602 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707883 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:09:40.711602 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707885 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:09:40.711602 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707888 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:09:40.711602 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707891 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:09:40.711602 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707895 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:09:40.711602 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707897 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:09:40.711602 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707900 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:09:40.711602 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707903 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:09:40.711602 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707905 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:09:40.711602 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707908 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:09:40.711602 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707910 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:09:40.711602 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707913 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:09:40.711602 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707915 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:09:40.711602 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:40.707918 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:09:40.712055 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.707923 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:09:40.712055 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.708664 2574 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 18:09:40.712055 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.710786 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 18:09:40.712055 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.711846 2574 server.go:1019] "Starting client certificate rotation"
Apr 16 18:09:40.712055 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.711949 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:09:40.712055 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.711989 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:09:40.739112 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.739084 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:09:40.744562 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.744537 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:09:40.760687 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.760654 2574 log.go:25] "Validated CRI v1 runtime API"
Apr 16 18:09:40.766843 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.766806 2574 log.go:25] "Validated CRI v1 image API"
Apr 16 18:09:40.768615 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.768593 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 18:09:40.772644 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.772606 2574 fs.go:135] Filesystem UUIDs: map[08f0eee0-c3a2-4f0b-8fb2-da0de739c78e:/dev/nvme0n1p3 67e25882-102d-4cc4-b22a-fee31401edf7:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 16 18:09:40.772644 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.772639 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 18:09:40.773309 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.773291 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:09:40.778482 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.778312 2574 manager.go:217] Machine: {Timestamp:2026-04-16 18:09:40.776494956 +0000 UTC m=+0.431729800 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3095121 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2a6aeb515b3728e9f901167b15d998 SystemUUID:ec2a6aeb-515b-3728-e9f9-01167b15d998 BootID:f765436f-54fd-41b7-8e6b-ada58c856802 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:1d:e9:d6:0c:bb Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:1d:e9:d6:0c:bb Speed:0 Mtu:9001} {Name:ovs-system MacAddress:62:f3:27:9d:a6:86 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 18:09:40.778482 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.778471 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 18:09:40.778602 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.778576 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 18:09:40.779611 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.779585 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 18:09:40.779773 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.779613 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-142-43.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 18:09:40.779839 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.779783 2574 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 18:09:40.779839 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.779790 2574 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 18:09:40.779839 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.779804 2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 18:09:40.779839 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.779818 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 18:09:40.781307 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.781294 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:09:40.781632 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.781621 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 18:09:40.784092 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.784078 2574 kubelet.go:491] "Attempting to sync node with API server" Apr 16 18:09:40.784139 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.784103 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 18:09:40.784139 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.784116 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 18:09:40.784139 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.784128 2574 kubelet.go:397] "Adding apiserver pod source" Apr 16 18:09:40.784139 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.784137 2574 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 16 18:09:40.785240 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.785226 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 18:09:40.785299 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.785246 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 18:09:40.788817 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.788791 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 18:09:40.790939 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.790923 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 18:09:40.792542 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.792531 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 18:09:40.792604 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.792547 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 18:09:40.792604 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.792553 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 18:09:40.792604 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.792559 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 18:09:40.792604 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.792565 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 18:09:40.792604 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.792571 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 18:09:40.792604 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.792577 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 
18:09:40.792604 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.792582 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 18:09:40.792604 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.792590 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 18:09:40.792604 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.792597 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 18:09:40.792604 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.792606 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 18:09:40.792890 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.792616 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 18:09:40.794569 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.794551 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 18:09:40.794569 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.794565 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 18:09:40.795785 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:40.795752 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 18:09:40.795785 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:40.795769 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-142-43.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 18:09:40.797279 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.797251 2574 csi_plugin.go:988] Failed 
to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-142-43.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 18:09:40.798754 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.798728 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 18:09:40.799014 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.799002 2574 server.go:1295] "Started kubelet" Apr 16 18:09:40.799120 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.799074 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 18:09:40.799197 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.799145 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 18:09:40.799242 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.799218 2574 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 18:09:40.800022 ip-10-0-142-43 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 18:09:40.800262 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.800246 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 18:09:40.801678 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.801664 2574 server.go:317] "Adding debug handlers to kubelet server" Apr 16 18:09:40.805318 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.805298 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 18:09:40.805783 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.805764 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 18:09:40.806371 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.806353 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 18:09:40.806371 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.806356 2574 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 18:09:40.806510 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.806384 2574 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 18:09:40.806510 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.806488 2574 reconstruct.go:97] "Volume reconstruction finished" Apr 16 18:09:40.806510 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.806498 2574 reconciler.go:26] "Reconciler: start to sync state" Apr 16 18:09:40.806655 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.806562 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zvwqb" Apr 16 18:09:40.806761 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:40.806744 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-43.ec2.internal\" not found" Apr 16 18:09:40.808123 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.808048 2574 factory.go:55] Registering systemd factory Apr 16 
18:09:40.808123 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.808066 2574 factory.go:223] Registration of the systemd container factory successfully Apr 16 18:09:40.808416 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.808402 2574 factory.go:153] Registering CRI-O factory Apr 16 18:09:40.808511 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.808503 2574 factory.go:223] Registration of the crio container factory successfully Apr 16 18:09:40.808642 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.808632 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 18:09:40.808721 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.808713 2574 factory.go:103] Registering Raw factory Apr 16 18:09:40.808783 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.808777 2574 manager.go:1196] Started watching for new ooms in manager Apr 16 18:09:40.809329 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.809315 2574 manager.go:319] Starting recovery of all containers Apr 16 18:09:40.809767 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:40.809738 2574 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 18:09:40.814556 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.814521 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zvwqb" Apr 16 18:09:40.820319 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.820098 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:09:40.820492 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.820213 2574 manager.go:324] Recovery completed Apr 16 18:09:40.823778 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:40.823756 2574 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-142-43.ec2.internal\" not found" node="ip-10-0-142-43.ec2.internal" Apr 16 18:09:40.825956 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.825943 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:09:40.828702 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.828683 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-43.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:09:40.828783 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.828718 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-43.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:09:40.828783 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.828732 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-43.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:09:40.829353 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.829337 2574 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 18:09:40.829353 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.829351 2574 cpu_manager.go:223] "Reconciling" 
reconcilePeriod="10s" Apr 16 18:09:40.829443 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.829369 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:09:40.832923 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.832907 2574 policy_none.go:49] "None policy: Start" Apr 16 18:09:40.832976 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.832930 2574 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 18:09:40.832976 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.832944 2574 state_mem.go:35] "Initializing new in-memory state store" Apr 16 18:09:40.880633 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.880615 2574 manager.go:341] "Starting Device Plugin manager" Apr 16 18:09:40.885885 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:40.880666 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 18:09:40.885885 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.880677 2574 server.go:85] "Starting device plugin registration server" Apr 16 18:09:40.885885 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.880968 2574 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 18:09:40.885885 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.880987 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 18:09:40.885885 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.881073 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 18:09:40.885885 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.881171 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 18:09:40.885885 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.881180 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 18:09:40.885885 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:40.881711 2574 
eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 18:09:40.885885 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:40.881750 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-142-43.ec2.internal\" not found" Apr 16 18:09:40.931558 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.931520 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 18:09:40.932783 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.932762 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 18:09:40.932869 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.932800 2574 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 18:09:40.932869 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.932843 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 18:09:40.932869 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.932854 2574 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 18:09:40.932983 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:40.932898 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 18:09:40.935290 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.935264 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:09:40.981490 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.981397 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:09:40.982413 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.982397 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-43.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:09:40.982512 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.982430 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-43.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:09:40.982512 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.982441 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-43.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:09:40.982512 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.982465 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-142-43.ec2.internal" Apr 16 18:09:40.991641 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:40.991618 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-142-43.ec2.internal" Apr 16 18:09:40.991709 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:40.991647 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-142-43.ec2.internal\": node \"ip-10-0-142-43.ec2.internal\" not found" Apr 16 18:09:41.006405 
ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:41.006381 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-43.ec2.internal\" not found" Apr 16 18:09:41.033189 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.033161 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-43.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-142-43.ec2.internal"] Apr 16 18:09:41.033258 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.033236 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:09:41.034721 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.034704 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-43.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:09:41.034846 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.034741 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-43.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:09:41.034846 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.034755 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-43.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:09:41.036319 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.036303 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:09:41.036465 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.036441 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-43.ec2.internal" Apr 16 18:09:41.036511 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.036481 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:09:41.037399 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.037381 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-43.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:09:41.037482 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.037411 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-43.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:09:41.037482 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.037422 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-43.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:09:41.037482 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.037380 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-43.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:09:41.037615 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.037492 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-43.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:09:41.037615 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.037510 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-43.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:09:41.039217 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.039198 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-43.ec2.internal" Apr 16 18:09:41.039303 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.039222 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:09:41.040013 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.039988 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-43.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:09:41.040109 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.040016 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-43.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:09:41.040109 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.040028 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-43.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:09:41.065876 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:41.065848 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-43.ec2.internal\" not found" node="ip-10-0-142-43.ec2.internal" Apr 16 18:09:41.069553 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:41.069535 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-43.ec2.internal\" not found" node="ip-10-0-142-43.ec2.internal" Apr 16 18:09:41.106533 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:41.106495 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-43.ec2.internal\" not found" Apr 16 18:09:41.107658 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.107641 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c0563976be90271e20ad81aea226de1a-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-142-43.ec2.internal\" (UID: \"c0563976be90271e20ad81aea226de1a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-43.ec2.internal" Apr 16 18:09:41.107722 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.107666 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c0563976be90271e20ad81aea226de1a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-43.ec2.internal\" (UID: \"c0563976be90271e20ad81aea226de1a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-43.ec2.internal" Apr 16 18:09:41.107722 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.107691 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d2abeffdf9790c6ff1185a908891d8f2-config\") pod \"kube-apiserver-proxy-ip-10-0-142-43.ec2.internal\" (UID: \"d2abeffdf9790c6ff1185a908891d8f2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-43.ec2.internal" Apr 16 18:09:41.206588 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:41.206556 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-43.ec2.internal\" not found" Apr 16 18:09:41.208781 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.208761 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c0563976be90271e20ad81aea226de1a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-43.ec2.internal\" (UID: \"c0563976be90271e20ad81aea226de1a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-43.ec2.internal" Apr 16 18:09:41.208841 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.208790 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/c0563976be90271e20ad81aea226de1a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-43.ec2.internal\" (UID: \"c0563976be90271e20ad81aea226de1a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-43.ec2.internal" Apr 16 18:09:41.208841 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.208809 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d2abeffdf9790c6ff1185a908891d8f2-config\") pod \"kube-apiserver-proxy-ip-10-0-142-43.ec2.internal\" (UID: \"d2abeffdf9790c6ff1185a908891d8f2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-43.ec2.internal" Apr 16 18:09:41.208922 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.208875 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c0563976be90271e20ad81aea226de1a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-43.ec2.internal\" (UID: \"c0563976be90271e20ad81aea226de1a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-43.ec2.internal" Apr 16 18:09:41.208922 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.208886 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d2abeffdf9790c6ff1185a908891d8f2-config\") pod \"kube-apiserver-proxy-ip-10-0-142-43.ec2.internal\" (UID: \"d2abeffdf9790c6ff1185a908891d8f2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-43.ec2.internal" Apr 16 18:09:41.208922 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.208893 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c0563976be90271e20ad81aea226de1a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-43.ec2.internal\" (UID: \"c0563976be90271e20ad81aea226de1a\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-43.ec2.internal" Apr 16 18:09:41.307346 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:41.307268 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-43.ec2.internal\" not found" Apr 16 18:09:41.369638 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.369615 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-43.ec2.internal" Apr 16 18:09:41.372171 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.372141 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-43.ec2.internal" Apr 16 18:09:41.407514 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:41.407480 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-43.ec2.internal\" not found" Apr 16 18:09:41.507950 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:41.507901 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-43.ec2.internal\" not found" Apr 16 18:09:41.608501 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:41.608425 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-43.ec2.internal\" not found" Apr 16 18:09:41.708933 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:41.708904 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-43.ec2.internal\" not found" Apr 16 18:09:41.713241 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.713219 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 18:09:41.713382 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.713365 2574 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:09:41.713465 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.713391 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:09:41.805999 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.805972 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 18:09:41.809259 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:41.809234 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-43.ec2.internal\" not found"
Apr 16 18:09:41.816511 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.816465 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 18:04:40 +0000 UTC" deadline="2028-01-30 12:00:43.294626542 +0000 UTC"
Apr 16 18:09:41.816511 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.816503 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15689h51m1.478127188s"
Apr 16 18:09:41.818708 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.818689 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:09:41.844349 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.844323 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-8gskw"
Apr 16 18:09:41.850602 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.850577 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-8gskw"
Apr 16 18:09:41.854866 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:41.854819 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2abeffdf9790c6ff1185a908891d8f2.slice/crio-5de28e11bab494b21aa8e73853c4b221552e05ca083eb935dbed0ea7391d709e WatchSource:0}: Error finding container 5de28e11bab494b21aa8e73853c4b221552e05ca083eb935dbed0ea7391d709e: Status 404 returned error can't find the container with id 5de28e11bab494b21aa8e73853c4b221552e05ca083eb935dbed0ea7391d709e
Apr 16 18:09:41.855150 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:41.855131 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0563976be90271e20ad81aea226de1a.slice/crio-48fe908c0ded3bf1c38dcc708cea31f3871292038e7f1028823bf29896b1fd6b WatchSource:0}: Error finding container 48fe908c0ded3bf1c38dcc708cea31f3871292038e7f1028823bf29896b1fd6b: Status 404 returned error can't find the container with id 48fe908c0ded3bf1c38dcc708cea31f3871292038e7f1028823bf29896b1fd6b
Apr 16 18:09:41.859226 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.859212 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:09:41.909656 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:41.909602 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-43.ec2.internal\" not found"
Apr 16 18:09:41.935871 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.935802 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-43.ec2.internal" event={"ID":"d2abeffdf9790c6ff1185a908891d8f2","Type":"ContainerStarted","Data":"5de28e11bab494b21aa8e73853c4b221552e05ca083eb935dbed0ea7391d709e"}
Apr 16 18:09:41.937186 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:41.937165 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-43.ec2.internal" event={"ID":"c0563976be90271e20ad81aea226de1a","Type":"ContainerStarted","Data":"48fe908c0ded3bf1c38dcc708cea31f3871292038e7f1028823bf29896b1fd6b"}
Apr 16 18:09:42.008867 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.008846 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:09:42.107021 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.106971 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-43.ec2.internal"
Apr 16 18:09:42.115478 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.115401 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:09:42.118877 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.118857 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 18:09:42.119792 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.119779 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-43.ec2.internal"
Apr 16 18:09:42.127451 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.127431 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 18:09:42.727341 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.727304 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:09:42.784912 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.784877 2574 apiserver.go:52] "Watching apiserver"
Apr 16 18:09:42.795278 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.795248 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 18:09:42.795673 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.795648 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-lwp8t","openshift-multus/multus-t2zvc","openshift-multus/network-metrics-daemon-kgtvr","openshift-ovn-kubernetes/ovnkube-node-wt28v","kube-system/global-pull-secret-syncer-sx24k","kube-system/konnectivity-agent-v5jrt","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vntm6","openshift-cluster-node-tuning-operator/tuned-6g7j6","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-43.ec2.internal","openshift-network-diagnostics/network-check-target-fswkr","openshift-network-operator/iptables-alerter-54l5n","kube-system/kube-apiserver-proxy-ip-10-0-142-43.ec2.internal","openshift-dns/node-resolver-b7tsj","openshift-image-registry/node-ca-fd7p9"]
Apr 16 18:09:42.798957 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.798924 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fswkr"
Apr 16 18:09:42.799097 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:42.799021 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fswkr" podUID="4b2b1c61-3440-4f11-9320-0d47781218e5"
Apr 16 18:09:42.799168 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.799093 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-54l5n"
Apr 16 18:09:42.800287 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.800260 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.801486 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.801458 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jphs5\""
Apr 16 18:09:42.801604 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.801492 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 18:09:42.801604 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.801550 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 18:09:42.801741 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.801715 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:09:42.801837 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.801808 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lwp8t"
Apr 16 18:09:42.802419 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.802401 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-kbf56\""
Apr 16 18:09:42.802589 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.802569 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 18:09:42.802689 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.802668 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 18:09:42.802778 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.802682 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 18:09:42.802778 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.802728 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 18:09:42.803189 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.803169 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 18:09:42.803297 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.803282 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 18:09:42.803451 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.803428 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sx24k"
Apr 16 18:09:42.803616 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:42.803508 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sx24k" podUID="15fcda3c-2ebe-475b-bd0f-7c9f1ed74875"
Apr 16 18:09:42.804276 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.804257 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 18:09:42.804678 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.804440 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 18:09:42.804678 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.804487 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 18:09:42.804678 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.804502 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 18:09:42.804678 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.804505 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-dlljj\""
Apr 16 18:09:42.804678 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.804672 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 18:09:42.805355 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.805043 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-v5jrt"
Apr 16 18:09:42.806459 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.806436 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vntm6"
Apr 16 18:09:42.808101 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.807803 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-95cd9\""
Apr 16 18:09:42.808101 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.807960 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 18:09:42.808101 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.808088 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 18:09:42.808101 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.808098 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6g7j6"
Apr 16 18:09:42.809172 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.809148 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 18:09:42.809266 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.809160 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 18:09:42.810397 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.810378 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgtvr"
Apr 16 18:09:42.810508 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:42.810477 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kgtvr" podUID="182ef3ca-8527-40a2-b1a7-c714bd3509c5"
Apr 16 18:09:42.811266 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.811248 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 18:09:42.811363 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.811336 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-kt97x\""
Apr 16 18:09:42.811895 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.811657 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 18:09:42.811895 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.811691 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-svd72\""
Apr 16 18:09:42.811895 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.811735 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:09:42.812498 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.812475 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-t2zvc"
Apr 16 18:09:42.814415 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.814393 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-b7tsj"
Apr 16 18:09:42.814785 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.814767 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-srkb8\""
Apr 16 18:09:42.814887 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.814805 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 18:09:42.815987 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.815969 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fd7p9"
Apr 16 18:09:42.816647 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.816626 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 18:09:42.816721 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.816670 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lwp8t\" (UID: \"a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9\") " pod="openshift-multus/multus-additional-cni-plugins-lwp8t"
Apr 16 18:09:42.816781 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.816718 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d40f8597-4c6c-46ba-9f26-4cea171429a6-ovnkube-script-lib\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.816781 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.816747 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/15fcda3c-2ebe-475b-bd0f-7c9f1ed74875-original-pull-secret\") pod \"global-pull-secret-syncer-sx24k\" (UID: \"15fcda3c-2ebe-475b-bd0f-7c9f1ed74875\") " pod="kube-system/global-pull-secret-syncer-sx24k"
Apr 16 18:09:42.816781 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.816772 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/33611e08-8e61-4b6d-ae3b-e6045feb85f0-etc-systemd\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6"
Apr 16 18:09:42.816971 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.816797 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdvfk\" (UniqueName: \"kubernetes.io/projected/33611e08-8e61-4b6d-ae3b-e6045feb85f0-kube-api-access-cdvfk\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6"
Apr 16 18:09:42.816971 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.816820 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-host-run-netns\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.816971 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.816861 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-log-socket\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.816971 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.816886 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/33611e08-8e61-4b6d-ae3b-e6045feb85f0-var-lib-kubelet\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6"
Apr 16 18:09:42.816971 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.816908 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/33611e08-8e61-4b6d-ae3b-e6045feb85f0-etc-tuned\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6"
Apr 16 18:09:42.816971 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.816935 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-run-systemd\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.816971 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.816956 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-var-lib-openvswitch\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.816971 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.816965 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 18:09:42.816971 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.816975 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0d695f1e-2c44-488a-8185-a74fe3736440-sys-fs\") pod \"aws-ebs-csi-driver-node-vntm6\" (UID: \"0d695f1e-2c44-488a-8185-a74fe3736440\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vntm6"
Apr 16 18:09:42.817399 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.816990 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9-cnibin\") pod \"multus-additional-cni-plugins-lwp8t\" (UID: \"a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9\") " pod="openshift-multus/multus-additional-cni-plugins-lwp8t"
Apr 16 18:09:42.817399 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.817006 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/33611e08-8e61-4b6d-ae3b-e6045feb85f0-tmp\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6"
Apr 16 18:09:42.817399 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.817026 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9-system-cni-dir\") pod \"multus-additional-cni-plugins-lwp8t\" (UID: \"a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9\") " pod="openshift-multus/multus-additional-cni-plugins-lwp8t"
Apr 16 18:09:42.817399 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.817039 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8bqjb\""
Apr 16 18:09:42.817399 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.817043 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9-cni-binary-copy\") pod \"multus-additional-cni-plugins-lwp8t\" (UID: \"a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9\") " pod="openshift-multus/multus-additional-cni-plugins-lwp8t"
Apr 16 18:09:42.817399 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.817096 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-host-run-ovn-kubernetes\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.817399 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.817128 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0d695f1e-2c44-488a-8185-a74fe3736440-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vntm6\" (UID: \"0d695f1e-2c44-488a-8185-a74fe3736440\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vntm6"
Apr 16 18:09:42.817399 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.817161 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c7dc3386-c234-45e0-91df-85d871e4cbbd-iptables-alerter-script\") pod \"iptables-alerter-54l5n\" (UID: \"c7dc3386-c234-45e0-91df-85d871e4cbbd\") " pod="openshift-network-operator/iptables-alerter-54l5n"
Apr 16 18:09:42.817399 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.817190 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c7dc3386-c234-45e0-91df-85d871e4cbbd-host-slash\") pod \"iptables-alerter-54l5n\" (UID: \"c7dc3386-c234-45e0-91df-85d871e4cbbd\") " pod="openshift-network-operator/iptables-alerter-54l5n"
Apr 16 18:09:42.817399 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.817230 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/33611e08-8e61-4b6d-ae3b-e6045feb85f0-host\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6"
Apr 16 18:09:42.817399 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.817269 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-946x6\" (UniqueName: \"kubernetes.io/projected/4b2b1c61-3440-4f11-9320-0d47781218e5-kube-api-access-946x6\") pod \"network-check-target-fswkr\" (UID: \"4b2b1c61-3440-4f11-9320-0d47781218e5\") " pod="openshift-network-diagnostics/network-check-target-fswkr"
Apr 16 18:09:42.817399 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.817293 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-etc-openvswitch\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.817399 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.817320 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0d695f1e-2c44-488a-8185-a74fe3736440-device-dir\") pod \"aws-ebs-csi-driver-node-vntm6\" (UID: \"0d695f1e-2c44-488a-8185-a74fe3736440\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vntm6"
Apr 16 18:09:42.817399 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.817351 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-host-cni-netd\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.817399 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.817373 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68lhn\" (UniqueName: \"kubernetes.io/projected/d40f8597-4c6c-46ba-9f26-4cea171429a6-kube-api-access-68lhn\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.817399 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.817402 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33611e08-8e61-4b6d-ae3b-e6045feb85f0-etc-kubernetes\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6"
Apr 16 18:09:42.818100 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.817445 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/15fcda3c-2ebe-475b-bd0f-7c9f1ed74875-dbus\") pod \"global-pull-secret-syncer-sx24k\" (UID: \"15fcda3c-2ebe-475b-bd0f-7c9f1ed74875\") " pod="kube-system/global-pull-secret-syncer-sx24k"
Apr 16 18:09:42.818100 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.817472 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-systemd-units\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.818100 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.817508 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d40f8597-4c6c-46ba-9f26-4cea171429a6-env-overrides\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.818100 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.817539 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/15fcda3c-2ebe-475b-bd0f-7c9f1ed74875-kubelet-config\") pod \"global-pull-secret-syncer-sx24k\" (UID: \"15fcda3c-2ebe-475b-bd0f-7c9f1ed74875\") " pod="kube-system/global-pull-secret-syncer-sx24k"
Apr 16 18:09:42.818100 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.817564 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-host-slash\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.818100 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.817585 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.818100 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.817610 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0d695f1e-2c44-488a-8185-a74fe3736440-socket-dir\") pod \"aws-ebs-csi-driver-node-vntm6\" (UID: \"0d695f1e-2c44-488a-8185-a74fe3736440\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vntm6"
Apr 16 18:09:42.818100 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.817626 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/33611e08-8e61-4b6d-ae3b-e6045feb85f0-etc-sysctl-d\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6"
Apr 16 18:09:42.818100 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.817640 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/33611e08-8e61-4b6d-ae3b-e6045feb85f0-lib-modules\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6"
Apr 16 18:09:42.818100 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.817674 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-host-cni-bin\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.818100 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.817709 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0d695f1e-2c44-488a-8185-a74fe3736440-registration-dir\") pod \"aws-ebs-csi-driver-node-vntm6\" (UID: \"0d695f1e-2c44-488a-8185-a74fe3736440\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vntm6"
Apr 16 18:09:42.818100 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.817739 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/33611e08-8e61-4b6d-ae3b-e6045feb85f0-etc-sysconfig\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6"
Apr 16 18:09:42.818100 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.817764 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/33611e08-8e61-4b6d-ae3b-e6045feb85f0-run\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6"
Apr 16 18:09:42.818100 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.817788 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-run-openvswitch\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.818100 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.817813 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d40f8597-4c6c-46ba-9f26-4cea171429a6-ovn-node-metrics-cert\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.818100 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.817856 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c6vf\" (UniqueName: \"kubernetes.io/projected/0d695f1e-2c44-488a-8185-a74fe3736440-kube-api-access-9c6vf\") pod \"aws-ebs-csi-driver-node-vntm6\" (UID: \"0d695f1e-2c44-488a-8185-a74fe3736440\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vntm6"
Apr 16 18:09:42.818774 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.817881 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9-os-release\") pod \"multus-additional-cni-plugins-lwp8t\" (UID: \"a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9\") " pod="openshift-multus/multus-additional-cni-plugins-lwp8t"
Apr 16 18:09:42.818774 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.817905 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vqpf\" (UniqueName: \"kubernetes.io/projected/c7dc3386-c234-45e0-91df-85d871e4cbbd-kube-api-access-5vqpf\") pod \"iptables-alerter-54l5n\" (UID: \"c7dc3386-c234-45e0-91df-85d871e4cbbd\") " pod="openshift-network-operator/iptables-alerter-54l5n"
Apr 16 18:09:42.818774 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.817940 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-host-kubelet\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.818774 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.817966 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0d695f1e-2c44-488a-8185-a74fe3736440-etc-selinux\") pod \"aws-ebs-csi-driver-node-vntm6\" (UID: \"0d695f1e-2c44-488a-8185-a74fe3736440\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vntm6"
Apr 16 18:09:42.818774 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.817987 2574 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/33611e08-8e61-4b6d-ae3b-e6045feb85f0-etc-modprobe-d\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6" Apr 16 18:09:42.818774 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.818019 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-run-ovn\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v" Apr 16 18:09:42.818774 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.818059 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7075ef9f-68cc-485d-bc93-6ebf4ae1fdd9-konnectivity-ca\") pod \"konnectivity-agent-v5jrt\" (UID: \"7075ef9f-68cc-485d-bc93-6ebf4ae1fdd9\") " pod="kube-system/konnectivity-agent-v5jrt" Apr 16 18:09:42.818774 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.818086 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lwp8t\" (UID: \"a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9\") " pod="openshift-multus/multus-additional-cni-plugins-lwp8t" Apr 16 18:09:42.818774 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.818112 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hplh\" (UniqueName: \"kubernetes.io/projected/a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9-kube-api-access-4hplh\") pod \"multus-additional-cni-plugins-lwp8t\" (UID: \"a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9\") " 
pod="openshift-multus/multus-additional-cni-plugins-lwp8t" Apr 16 18:09:42.818774 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.818176 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/33611e08-8e61-4b6d-ae3b-e6045feb85f0-etc-sysctl-conf\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6" Apr 16 18:09:42.818774 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.818205 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/33611e08-8e61-4b6d-ae3b-e6045feb85f0-sys\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6" Apr 16 18:09:42.818774 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.818232 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d40f8597-4c6c-46ba-9f26-4cea171429a6-ovnkube-config\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v" Apr 16 18:09:42.818774 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.818240 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 18:09:42.818774 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.818260 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-node-log\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v" Apr 16 18:09:42.818774 ip-10-0-142-43 kubenswrapper[2574]: I0416 
18:09:42.818292 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lwp8t\" (UID: \"a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9\") " pod="openshift-multus/multus-additional-cni-plugins-lwp8t" Apr 16 18:09:42.818774 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.818321 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7075ef9f-68cc-485d-bc93-6ebf4ae1fdd9-agent-certs\") pod \"konnectivity-agent-v5jrt\" (UID: \"7075ef9f-68cc-485d-bc93-6ebf4ae1fdd9\") " pod="kube-system/konnectivity-agent-v5jrt" Apr 16 18:09:42.818774 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.818483 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 18:09:42.818774 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.818594 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 18:09:42.819512 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.818808 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-mg4vw\"" Apr 16 18:09:42.851579 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.851544 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:04:41 +0000 UTC" deadline="2027-10-28 21:46:47.198383376 +0000 UTC" Apr 16 18:09:42.851579 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.851574 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13443h37m4.346813023s" Apr 16 
18:09:42.907390 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.907356 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 18:09:42.919002 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.918959 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9-os-release\") pod \"multus-additional-cni-plugins-lwp8t\" (UID: \"a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9\") " pod="openshift-multus/multus-additional-cni-plugins-lwp8t" Apr 16 18:09:42.919184 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.919029 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vqpf\" (UniqueName: \"kubernetes.io/projected/c7dc3386-c234-45e0-91df-85d871e4cbbd-kube-api-access-5vqpf\") pod \"iptables-alerter-54l5n\" (UID: \"c7dc3386-c234-45e0-91df-85d871e4cbbd\") " pod="openshift-network-operator/iptables-alerter-54l5n" Apr 16 18:09:42.919184 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.919068 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-host-kubelet\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v" Apr 16 18:09:42.919184 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.919085 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9-os-release\") pod \"multus-additional-cni-plugins-lwp8t\" (UID: \"a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9\") " pod="openshift-multus/multus-additional-cni-plugins-lwp8t" Apr 16 18:09:42.919184 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.919099 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0fe74f00-c50b-4f93-a926-43b61e8e6182-cni-binary-copy\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:42.919184 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.919124 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-hostroot\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:42.919184 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.919148 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4j5q\" (UniqueName: \"kubernetes.io/projected/0fe74f00-c50b-4f93-a926-43b61e8e6182-kube-api-access-w4j5q\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:42.919184 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.919170 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/25e589f1-86e1-42cd-a623-02b4361d82ee-hosts-file\") pod \"node-resolver-b7tsj\" (UID: \"25e589f1-86e1-42cd-a623-02b4361d82ee\") " pod="openshift-dns/node-resolver-b7tsj" Apr 16 18:09:42.919571 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.919196 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0d695f1e-2c44-488a-8185-a74fe3736440-etc-selinux\") pod \"aws-ebs-csi-driver-node-vntm6\" (UID: \"0d695f1e-2c44-488a-8185-a74fe3736440\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vntm6" Apr 16 18:09:42.919571 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.919222 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/33611e08-8e61-4b6d-ae3b-e6045feb85f0-etc-modprobe-d\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6" Apr 16 18:09:42.919571 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.919249 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-run-ovn\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v" Apr 16 18:09:42.919571 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.919277 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-os-release\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:42.919571 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.919305 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7075ef9f-68cc-485d-bc93-6ebf4ae1fdd9-konnectivity-ca\") pod \"konnectivity-agent-v5jrt\" (UID: \"7075ef9f-68cc-485d-bc93-6ebf4ae1fdd9\") " pod="kube-system/konnectivity-agent-v5jrt" Apr 16 18:09:42.919571 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.919345 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lwp8t\" (UID: \"a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9\") " pod="openshift-multus/multus-additional-cni-plugins-lwp8t" Apr 16 18:09:42.919571 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.919345 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-run-ovn\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v" Apr 16 18:09:42.919571 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.919375 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hplh\" (UniqueName: \"kubernetes.io/projected/a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9-kube-api-access-4hplh\") pod \"multus-additional-cni-plugins-lwp8t\" (UID: \"a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9\") " pod="openshift-multus/multus-additional-cni-plugins-lwp8t" Apr 16 18:09:42.919571 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.919380 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-host-kubelet\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v" Apr 16 18:09:42.919571 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.919345 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0d695f1e-2c44-488a-8185-a74fe3736440-etc-selinux\") pod \"aws-ebs-csi-driver-node-vntm6\" (UID: \"0d695f1e-2c44-488a-8185-a74fe3736440\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vntm6" Apr 16 18:09:42.919571 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.919408 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/33611e08-8e61-4b6d-ae3b-e6045feb85f0-etc-sysctl-conf\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6" Apr 16 18:09:42.919571 ip-10-0-142-43 
kubenswrapper[2574]: I0416 18:09:42.919389 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/33611e08-8e61-4b6d-ae3b-e6045feb85f0-etc-modprobe-d\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6" Apr 16 18:09:42.919571 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.919522 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/33611e08-8e61-4b6d-ae3b-e6045feb85f0-sys\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6" Apr 16 18:09:42.919571 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.919560 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d40f8597-4c6c-46ba-9f26-4cea171429a6-ovnkube-config\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v" Apr 16 18:09:42.919571 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.919566 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/33611e08-8e61-4b6d-ae3b-e6045feb85f0-etc-sysctl-conf\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6" Apr 16 18:09:42.920273 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.919599 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-host-var-lib-cni-bin\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:42.920273 ip-10-0-142-43 
kubenswrapper[2574]: I0416 18:09:42.919627 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/33611e08-8e61-4b6d-ae3b-e6045feb85f0-sys\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6" Apr 16 18:09:42.920273 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.919644 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-node-log\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v" Apr 16 18:09:42.920273 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.919673 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4kz7\" (UniqueName: \"kubernetes.io/projected/25e589f1-86e1-42cd-a623-02b4361d82ee-kube-api-access-n4kz7\") pod \"node-resolver-b7tsj\" (UID: \"25e589f1-86e1-42cd-a623-02b4361d82ee\") " pod="openshift-dns/node-resolver-b7tsj" Apr 16 18:09:42.920273 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.919702 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lwp8t\" (UID: \"a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9\") " pod="openshift-multus/multus-additional-cni-plugins-lwp8t" Apr 16 18:09:42.920273 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.919721 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-node-log\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v" Apr 16 
18:09:42.920273 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.919730 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7075ef9f-68cc-485d-bc93-6ebf4ae1fdd9-agent-certs\") pod \"konnectivity-agent-v5jrt\" (UID: \"7075ef9f-68cc-485d-bc93-6ebf4ae1fdd9\") " pod="kube-system/konnectivity-agent-v5jrt" Apr 16 18:09:42.920273 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.919784 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lwp8t\" (UID: \"a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9\") " pod="openshift-multus/multus-additional-cni-plugins-lwp8t" Apr 16 18:09:42.920273 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.919846 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d40f8597-4c6c-46ba-9f26-4cea171429a6-ovnkube-script-lib\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v" Apr 16 18:09:42.920273 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.919876 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-multus-cni-dir\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:42.920273 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.919900 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-multus-conf-dir\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " 
pod="openshift-multus/multus-t2zvc" Apr 16 18:09:42.920273 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.919934 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/15fcda3c-2ebe-475b-bd0f-7c9f1ed74875-original-pull-secret\") pod \"global-pull-secret-syncer-sx24k\" (UID: \"15fcda3c-2ebe-475b-bd0f-7c9f1ed74875\") " pod="kube-system/global-pull-secret-syncer-sx24k" Apr 16 18:09:42.920273 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.919951 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/33611e08-8e61-4b6d-ae3b-e6045feb85f0-etc-systemd\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6" Apr 16 18:09:42.920273 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.919975 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cdvfk\" (UniqueName: \"kubernetes.io/projected/33611e08-8e61-4b6d-ae3b-e6045feb85f0-kube-api-access-cdvfk\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6" Apr 16 18:09:42.920273 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920006 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-host-run-netns\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v" Apr 16 18:09:42.920273 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920024 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-log-socket\") pod \"ovnkube-node-wt28v\" (UID: 
\"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v" Apr 16 18:09:42.920273 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920039 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-multus-socket-dir-parent\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:42.921081 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920039 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7075ef9f-68cc-485d-bc93-6ebf4ae1fdd9-konnectivity-ca\") pod \"konnectivity-agent-v5jrt\" (UID: \"7075ef9f-68cc-485d-bc93-6ebf4ae1fdd9\") " pod="kube-system/konnectivity-agent-v5jrt" Apr 16 18:09:42.921081 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920065 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-host-run-netns\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:42.921081 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920061 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lwp8t\" (UID: \"a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9\") " pod="openshift-multus/multus-additional-cni-plugins-lwp8t" Apr 16 18:09:42.921081 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920087 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/33611e08-8e61-4b6d-ae3b-e6045feb85f0-var-lib-kubelet\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6" Apr 16 18:09:42.921081 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920106 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/33611e08-8e61-4b6d-ae3b-e6045feb85f0-etc-tuned\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6" Apr 16 18:09:42.921081 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920126 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-run-systemd\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v" Apr 16 18:09:42.921081 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920136 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d40f8597-4c6c-46ba-9f26-4cea171429a6-ovnkube-config\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v" Apr 16 18:09:42.921081 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920143 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/33611e08-8e61-4b6d-ae3b-e6045feb85f0-etc-systemd\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6" Apr 16 18:09:42.921081 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920150 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-var-lib-openvswitch\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.921081 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920163 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 18:09:42.921081 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920189 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-etc-kubernetes\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc"
Apr 16 18:09:42.921081 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920195 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-var-lib-openvswitch\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.921081 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920222 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0d695f1e-2c44-488a-8185-a74fe3736440-sys-fs\") pod \"aws-ebs-csi-driver-node-vntm6\" (UID: \"0d695f1e-2c44-488a-8185-a74fe3736440\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vntm6"
Apr 16 18:09:42.921081 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920210 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lwp8t\" (UID: \"a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9\") " pod="openshift-multus/multus-additional-cni-plugins-lwp8t"
Apr 16 18:09:42.921081 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920249 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9-cnibin\") pod \"multus-additional-cni-plugins-lwp8t\" (UID: \"a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9\") " pod="openshift-multus/multus-additional-cni-plugins-lwp8t"
Apr 16 18:09:42.921081 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:42.920255 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:09:42.921081 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920274 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/33611e08-8e61-4b6d-ae3b-e6045feb85f0-tmp\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6"
Apr 16 18:09:42.921081 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920299 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9-system-cni-dir\") pod \"multus-additional-cni-plugins-lwp8t\" (UID: \"a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9\") " pod="openshift-multus/multus-additional-cni-plugins-lwp8t"
Apr 16 18:09:42.921932 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920314 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0d695f1e-2c44-488a-8185-a74fe3736440-sys-fs\") pod \"aws-ebs-csi-driver-node-vntm6\" (UID: \"0d695f1e-2c44-488a-8185-a74fe3736440\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vntm6"
Apr 16 18:09:42.921932 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:42.920336 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15fcda3c-2ebe-475b-bd0f-7c9f1ed74875-original-pull-secret podName:15fcda3c-2ebe-475b-bd0f-7c9f1ed74875 nodeName:}" failed. No retries permitted until 2026-04-16 18:09:43.420304986 +0000 UTC m=+3.075539834 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/15fcda3c-2ebe-475b-bd0f-7c9f1ed74875-original-pull-secret") pod "global-pull-secret-syncer-sx24k" (UID: "15fcda3c-2ebe-475b-bd0f-7c9f1ed74875") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:09:42.921932 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920360 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9-cnibin\") pod \"multus-additional-cni-plugins-lwp8t\" (UID: \"a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9\") " pod="openshift-multus/multus-additional-cni-plugins-lwp8t"
Apr 16 18:09:42.921932 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920399 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-host-run-netns\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.921932 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920428 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/33611e08-8e61-4b6d-ae3b-e6045feb85f0-var-lib-kubelet\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6"
Apr 16 18:09:42.921932 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920435 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-log-socket\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.921932 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920334 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9-system-cni-dir\") pod \"multus-additional-cni-plugins-lwp8t\" (UID: \"a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9\") " pod="openshift-multus/multus-additional-cni-plugins-lwp8t"
Apr 16 18:09:42.921932 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920495 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-run-systemd\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.921932 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920517 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9-cni-binary-copy\") pod \"multus-additional-cni-plugins-lwp8t\" (UID: \"a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9\") " pod="openshift-multus/multus-additional-cni-plugins-lwp8t"
Apr 16 18:09:42.921932 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920554 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d40f8597-4c6c-46ba-9f26-4cea171429a6-ovnkube-script-lib\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.921932 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920565 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-host-run-ovn-kubernetes\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.921932 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920597 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0d695f1e-2c44-488a-8185-a74fe3736440-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vntm6\" (UID: \"0d695f1e-2c44-488a-8185-a74fe3736440\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vntm6"
Apr 16 18:09:42.921932 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920607 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lwp8t\" (UID: \"a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9\") " pod="openshift-multus/multus-additional-cni-plugins-lwp8t"
Apr 16 18:09:42.921932 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920627 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c7dc3386-c234-45e0-91df-85d871e4cbbd-iptables-alerter-script\") pod \"iptables-alerter-54l5n\" (UID: \"c7dc3386-c234-45e0-91df-85d871e4cbbd\") " pod="openshift-network-operator/iptables-alerter-54l5n"
Apr 16 18:09:42.921932 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920668 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c7dc3386-c234-45e0-91df-85d871e4cbbd-host-slash\") pod \"iptables-alerter-54l5n\" (UID: \"c7dc3386-c234-45e0-91df-85d871e4cbbd\") " pod="openshift-network-operator/iptables-alerter-54l5n"
Apr 16 18:09:42.921932 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920669 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-host-run-ovn-kubernetes\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.922657 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920701 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltttf\" (UniqueName: \"kubernetes.io/projected/182ef3ca-8527-40a2-b1a7-c714bd3509c5-kube-api-access-ltttf\") pod \"network-metrics-daemon-kgtvr\" (UID: \"182ef3ca-8527-40a2-b1a7-c714bd3509c5\") " pod="openshift-multus/network-metrics-daemon-kgtvr"
Apr 16 18:09:42.922657 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920717 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c7dc3386-c234-45e0-91df-85d871e4cbbd-host-slash\") pod \"iptables-alerter-54l5n\" (UID: \"c7dc3386-c234-45e0-91df-85d871e4cbbd\") " pod="openshift-network-operator/iptables-alerter-54l5n"
Apr 16 18:09:42.922657 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920730 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-host-var-lib-cni-multus\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc"
Apr 16 18:09:42.922657 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920796 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0fe74f00-c50b-4f93-a926-43b61e8e6182-multus-daemon-config\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc"
Apr 16 18:09:42.922657 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920860 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/33611e08-8e61-4b6d-ae3b-e6045feb85f0-host\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6"
Apr 16 18:09:42.922657 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920894 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-946x6\" (UniqueName: \"kubernetes.io/projected/4b2b1c61-3440-4f11-9320-0d47781218e5-kube-api-access-946x6\") pod \"network-check-target-fswkr\" (UID: \"4b2b1c61-3440-4f11-9320-0d47781218e5\") " pod="openshift-network-diagnostics/network-check-target-fswkr"
Apr 16 18:09:42.922657 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920903 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/33611e08-8e61-4b6d-ae3b-e6045feb85f0-host\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6"
Apr 16 18:09:42.922657 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920862 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0d695f1e-2c44-488a-8185-a74fe3736440-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vntm6\" (UID: \"0d695f1e-2c44-488a-8185-a74fe3736440\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vntm6"
Apr 16 18:09:42.922657 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920921 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-etc-openvswitch\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.922657 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.920960 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-etc-openvswitch\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.922657 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921013 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9-cni-binary-copy\") pod \"multus-additional-cni-plugins-lwp8t\" (UID: \"a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9\") " pod="openshift-multus/multus-additional-cni-plugins-lwp8t"
Apr 16 18:09:42.922657 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921059 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8e7dc651-2c6d-4d3f-912d-c2f49dfd76a0-host\") pod \"node-ca-fd7p9\" (UID: \"8e7dc651-2c6d-4d3f-912d-c2f49dfd76a0\") " pod="openshift-image-registry/node-ca-fd7p9"
Apr 16 18:09:42.922657 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921104 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0d695f1e-2c44-488a-8185-a74fe3736440-device-dir\") pod \"aws-ebs-csi-driver-node-vntm6\" (UID: \"0d695f1e-2c44-488a-8185-a74fe3736440\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vntm6"
Apr 16 18:09:42.922657 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921131 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-host-cni-netd\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.922657 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921149 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0d695f1e-2c44-488a-8185-a74fe3736440-device-dir\") pod \"aws-ebs-csi-driver-node-vntm6\" (UID: \"0d695f1e-2c44-488a-8185-a74fe3736440\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vntm6"
Apr 16 18:09:42.922657 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921157 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-68lhn\" (UniqueName: \"kubernetes.io/projected/d40f8597-4c6c-46ba-9f26-4cea171429a6-kube-api-access-68lhn\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.922657 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921182 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c7dc3386-c234-45e0-91df-85d871e4cbbd-iptables-alerter-script\") pod \"iptables-alerter-54l5n\" (UID: \"c7dc3386-c234-45e0-91df-85d871e4cbbd\") " pod="openshift-network-operator/iptables-alerter-54l5n"
Apr 16 18:09:42.923486 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921193 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-host-cni-netd\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.923486 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921242 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx4v9\" (UniqueName: \"kubernetes.io/projected/8e7dc651-2c6d-4d3f-912d-c2f49dfd76a0-kube-api-access-jx4v9\") pod \"node-ca-fd7p9\" (UID: \"8e7dc651-2c6d-4d3f-912d-c2f49dfd76a0\") " pod="openshift-image-registry/node-ca-fd7p9"
Apr 16 18:09:42.923486 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921277 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33611e08-8e61-4b6d-ae3b-e6045feb85f0-etc-kubernetes\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6"
Apr 16 18:09:42.923486 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921306 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/15fcda3c-2ebe-475b-bd0f-7c9f1ed74875-dbus\") pod \"global-pull-secret-syncer-sx24k\" (UID: \"15fcda3c-2ebe-475b-bd0f-7c9f1ed74875\") " pod="kube-system/global-pull-secret-syncer-sx24k"
Apr 16 18:09:42.923486 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921331 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-systemd-units\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.923486 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921355 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d40f8597-4c6c-46ba-9f26-4cea171429a6-env-overrides\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.923486 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921368 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33611e08-8e61-4b6d-ae3b-e6045feb85f0-etc-kubernetes\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6"
Apr 16 18:09:42.923486 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921388 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-cnibin\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc"
Apr 16 18:09:42.923486 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921410 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-systemd-units\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.923486 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921416 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/15fcda3c-2ebe-475b-bd0f-7c9f1ed74875-kubelet-config\") pod \"global-pull-secret-syncer-sx24k\" (UID: \"15fcda3c-2ebe-475b-bd0f-7c9f1ed74875\") " pod="kube-system/global-pull-secret-syncer-sx24k"
Apr 16 18:09:42.923486 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921448 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-host-slash\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.923486 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921478 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/15fcda3c-2ebe-475b-bd0f-7c9f1ed74875-dbus\") pod \"global-pull-secret-syncer-sx24k\" (UID: \"15fcda3c-2ebe-475b-bd0f-7c9f1ed74875\") " pod="kube-system/global-pull-secret-syncer-sx24k"
Apr 16 18:09:42.923486 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921481 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.923486 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921518 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-system-cni-dir\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc"
Apr 16 18:09:42.923486 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921532 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-host-slash\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.923486 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921546 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/25e589f1-86e1-42cd-a623-02b4361d82ee-tmp-dir\") pod \"node-resolver-b7tsj\" (UID: \"25e589f1-86e1-42cd-a623-02b4361d82ee\") " pod="openshift-dns/node-resolver-b7tsj"
Apr 16 18:09:42.923486 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921547 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/15fcda3c-2ebe-475b-bd0f-7c9f1ed74875-kubelet-config\") pod \"global-pull-secret-syncer-sx24k\" (UID: \"15fcda3c-2ebe-475b-bd0f-7c9f1ed74875\") " pod="kube-system/global-pull-secret-syncer-sx24k"
Apr 16 18:09:42.924324 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921581 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8e7dc651-2c6d-4d3f-912d-c2f49dfd76a0-serviceca\") pod \"node-ca-fd7p9\" (UID: \"8e7dc651-2c6d-4d3f-912d-c2f49dfd76a0\") " pod="openshift-image-registry/node-ca-fd7p9"
Apr 16 18:09:42.924324 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921590 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.924324 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921610 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0d695f1e-2c44-488a-8185-a74fe3736440-socket-dir\") pod \"aws-ebs-csi-driver-node-vntm6\" (UID: \"0d695f1e-2c44-488a-8185-a74fe3736440\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vntm6"
Apr 16 18:09:42.924324 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921638 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/33611e08-8e61-4b6d-ae3b-e6045feb85f0-etc-sysctl-d\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6"
Apr 16 18:09:42.924324 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921662 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/33611e08-8e61-4b6d-ae3b-e6045feb85f0-lib-modules\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6"
Apr 16 18:09:42.924324 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921688 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-host-cni-bin\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.924324 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921714 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-host-run-multus-certs\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc"
Apr 16 18:09:42.924324 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921745 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/33611e08-8e61-4b6d-ae3b-e6045feb85f0-etc-sysctl-d\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6"
Apr 16 18:09:42.924324 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921745 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0d695f1e-2c44-488a-8185-a74fe3736440-registration-dir\") pod \"aws-ebs-csi-driver-node-vntm6\" (UID: \"0d695f1e-2c44-488a-8185-a74fe3736440\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vntm6"
Apr 16 18:09:42.924324 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921771 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0d695f1e-2c44-488a-8185-a74fe3736440-socket-dir\") pod \"aws-ebs-csi-driver-node-vntm6\" (UID: \"0d695f1e-2c44-488a-8185-a74fe3736440\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vntm6"
Apr 16 18:09:42.924324 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921773 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d40f8597-4c6c-46ba-9f26-4cea171429a6-env-overrides\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.924324 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921789 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/33611e08-8e61-4b6d-ae3b-e6045feb85f0-etc-sysconfig\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6"
Apr 16 18:09:42.924324 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921795 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0d695f1e-2c44-488a-8185-a74fe3736440-registration-dir\") pod \"aws-ebs-csi-driver-node-vntm6\" (UID: \"0d695f1e-2c44-488a-8185-a74fe3736440\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vntm6"
Apr 16 18:09:42.924324 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921812 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-host-cni-bin\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.924324 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921869 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/33611e08-8e61-4b6d-ae3b-e6045feb85f0-run\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6"
Apr 16 18:09:42.924324 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921874 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/33611e08-8e61-4b6d-ae3b-e6045feb85f0-etc-sysconfig\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6"
Apr 16 18:09:42.924324 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921915 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-run-openvswitch\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.925021 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921925 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/33611e08-8e61-4b6d-ae3b-e6045feb85f0-lib-modules\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6"
Apr 16 18:09:42.925021 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921946 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/33611e08-8e61-4b6d-ae3b-e6045feb85f0-run\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6"
Apr 16 18:09:42.925021 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921946 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d40f8597-4c6c-46ba-9f26-4cea171429a6-ovn-node-metrics-cert\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.925021 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.921973 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d40f8597-4c6c-46ba-9f26-4cea171429a6-run-openvswitch\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.925021 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.922020 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/182ef3ca-8527-40a2-b1a7-c714bd3509c5-metrics-certs\") pod \"network-metrics-daemon-kgtvr\" (UID: \"182ef3ca-8527-40a2-b1a7-c714bd3509c5\") " pod="openshift-multus/network-metrics-daemon-kgtvr"
Apr 16 18:09:42.925021 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.922053 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-host-run-k8s-cni-cncf-io\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc"
Apr 16 18:09:42.925021 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.922078 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-host-var-lib-kubelet\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc"
Apr 16 18:09:42.925021 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.922110 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9c6vf\" (UniqueName: \"kubernetes.io/projected/0d695f1e-2c44-488a-8185-a74fe3736440-kube-api-access-9c6vf\") pod \"aws-ebs-csi-driver-node-vntm6\" (UID: \"0d695f1e-2c44-488a-8185-a74fe3736440\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vntm6"
Apr 16 18:09:42.925021 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.924163 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/33611e08-8e61-4b6d-ae3b-e6045feb85f0-tmp\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6"
Apr 16 18:09:42.925021 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.924390 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/33611e08-8e61-4b6d-ae3b-e6045feb85f0-etc-tuned\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6"
Apr 16 18:09:42.925021 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.924603 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d40f8597-4c6c-46ba-9f26-4cea171429a6-ovn-node-metrics-cert\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.925021 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.924690 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7075ef9f-68cc-485d-bc93-6ebf4ae1fdd9-agent-certs\") pod \"konnectivity-agent-v5jrt\" (UID: \"7075ef9f-68cc-485d-bc93-6ebf4ae1fdd9\") " pod="kube-system/konnectivity-agent-v5jrt"
Apr 16 18:09:42.928267 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:42.928231 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:09:42.928267 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:42.928263 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:09:42.928267 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:42.928276 2574 projected.go:194] Error preparing data for projected volume kube-api-access-946x6 for pod openshift-network-diagnostics/network-check-target-fswkr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:09:42.928533 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.928283 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vqpf\" (UniqueName: \"kubernetes.io/projected/c7dc3386-c234-45e0-91df-85d871e4cbbd-kube-api-access-5vqpf\") pod \"iptables-alerter-54l5n\" (UID: \"c7dc3386-c234-45e0-91df-85d871e4cbbd\") " pod="openshift-network-operator/iptables-alerter-54l5n"
Apr 16 18:09:42.928533 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:42.928338 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b2b1c61-3440-4f11-9320-0d47781218e5-kube-api-access-946x6 podName:4b2b1c61-3440-4f11-9320-0d47781218e5 nodeName:}" failed. No retries permitted until 2026-04-16 18:09:43.428320103 +0000 UTC m=+3.083554938 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-946x6" (UniqueName: "kubernetes.io/projected/4b2b1c61-3440-4f11-9320-0d47781218e5-kube-api-access-946x6") pod "network-check-target-fswkr" (UID: "4b2b1c61-3440-4f11-9320-0d47781218e5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:09:42.928651 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.928630 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hplh\" (UniqueName: \"kubernetes.io/projected/a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9-kube-api-access-4hplh\") pod \"multus-additional-cni-plugins-lwp8t\" (UID: \"a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9\") " pod="openshift-multus/multus-additional-cni-plugins-lwp8t"
Apr 16 18:09:42.929263 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.929238 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdvfk\" (UniqueName: \"kubernetes.io/projected/33611e08-8e61-4b6d-ae3b-e6045feb85f0-kube-api-access-cdvfk\") pod \"tuned-6g7j6\" (UID: \"33611e08-8e61-4b6d-ae3b-e6045feb85f0\") " pod="openshift-cluster-node-tuning-operator/tuned-6g7j6"
Apr 16 18:09:42.930759 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.930680 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-68lhn\" (UniqueName: \"kubernetes.io/projected/d40f8597-4c6c-46ba-9f26-4cea171429a6-kube-api-access-68lhn\") pod \"ovnkube-node-wt28v\" (UID: \"d40f8597-4c6c-46ba-9f26-4cea171429a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:09:42.930949 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:42.930931 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c6vf\" (UniqueName: \"kubernetes.io/projected/0d695f1e-2c44-488a-8185-a74fe3736440-kube-api-access-9c6vf\") pod \"aws-ebs-csi-driver-node-vntm6\" (UID: \"0d695f1e-2c44-488a-8185-a74fe3736440\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vntm6"
Apr 16 18:09:43.023276 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.023184 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-host-run-netns\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc"
Apr 16 18:09:43.023276 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.023228 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-etc-kubernetes\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc"
Apr 16 18:09:43.023276 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.023258 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ltttf\" (UniqueName: \"kubernetes.io/projected/182ef3ca-8527-40a2-b1a7-c714bd3509c5-kube-api-access-ltttf\") pod \"network-metrics-daemon-kgtvr\" (UID: \"182ef3ca-8527-40a2-b1a7-c714bd3509c5\") " pod="openshift-multus/network-metrics-daemon-kgtvr"
Apr 16 18:09:43.023541 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.023282 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-host-var-lib-cni-multus\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc"
Apr 16 18:09:43.023541 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.023305 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0fe74f00-c50b-4f93-a926-43b61e8e6182-multus-daemon-config\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:43.023541 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.023327 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-etc-kubernetes\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:43.023541 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.023327 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-host-run-netns\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:43.023541 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.023369 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-host-var-lib-cni-multus\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:43.023541 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.023345 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8e7dc651-2c6d-4d3f-912d-c2f49dfd76a0-host\") pod \"node-ca-fd7p9\" (UID: \"8e7dc651-2c6d-4d3f-912d-c2f49dfd76a0\") " pod="openshift-image-registry/node-ca-fd7p9" Apr 16 18:09:43.023541 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.023418 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jx4v9\" (UniqueName: 
\"kubernetes.io/projected/8e7dc651-2c6d-4d3f-912d-c2f49dfd76a0-kube-api-access-jx4v9\") pod \"node-ca-fd7p9\" (UID: \"8e7dc651-2c6d-4d3f-912d-c2f49dfd76a0\") " pod="openshift-image-registry/node-ca-fd7p9" Apr 16 18:09:43.023541 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.023477 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8e7dc651-2c6d-4d3f-912d-c2f49dfd76a0-host\") pod \"node-ca-fd7p9\" (UID: \"8e7dc651-2c6d-4d3f-912d-c2f49dfd76a0\") " pod="openshift-image-registry/node-ca-fd7p9" Apr 16 18:09:43.023541 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.023532 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-cnibin\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:43.023955 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.023565 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-system-cni-dir\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:43.023955 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.023590 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/25e589f1-86e1-42cd-a623-02b4361d82ee-tmp-dir\") pod \"node-resolver-b7tsj\" (UID: \"25e589f1-86e1-42cd-a623-02b4361d82ee\") " pod="openshift-dns/node-resolver-b7tsj" Apr 16 18:09:43.023955 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.023609 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-cnibin\") pod \"multus-t2zvc\" (UID: 
\"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:43.023955 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.023647 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8e7dc651-2c6d-4d3f-912d-c2f49dfd76a0-serviceca\") pod \"node-ca-fd7p9\" (UID: \"8e7dc651-2c6d-4d3f-912d-c2f49dfd76a0\") " pod="openshift-image-registry/node-ca-fd7p9" Apr 16 18:09:43.023955 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.023695 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-host-run-multus-certs\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:43.023955 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.023652 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-system-cni-dir\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:43.023955 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.023753 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/182ef3ca-8527-40a2-b1a7-c714bd3509c5-metrics-certs\") pod \"network-metrics-daemon-kgtvr\" (UID: \"182ef3ca-8527-40a2-b1a7-c714bd3509c5\") " pod="openshift-multus/network-metrics-daemon-kgtvr" Apr 16 18:09:43.023955 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.023781 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-host-run-k8s-cni-cncf-io\") pod \"multus-t2zvc\" (UID: 
\"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:43.023955 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.023796 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-host-run-multus-certs\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:43.023955 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.023806 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-host-var-lib-kubelet\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:43.023955 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.023862 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0fe74f00-c50b-4f93-a926-43b61e8e6182-cni-binary-copy\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:43.023955 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.023869 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-host-run-k8s-cni-cncf-io\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:43.023955 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.023890 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-hostroot\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " 
pod="openshift-multus/multus-t2zvc" Apr 16 18:09:43.023955 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:43.023904 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:09:43.023955 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.023915 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-host-var-lib-kubelet\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:43.023955 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.023916 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w4j5q\" (UniqueName: \"kubernetes.io/projected/0fe74f00-c50b-4f93-a926-43b61e8e6182-kube-api-access-w4j5q\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:43.023955 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.023955 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/25e589f1-86e1-42cd-a623-02b4361d82ee-hosts-file\") pod \"node-resolver-b7tsj\" (UID: \"25e589f1-86e1-42cd-a623-02b4361d82ee\") " pod="openshift-dns/node-resolver-b7tsj" Apr 16 18:09:43.024739 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:43.023974 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/182ef3ca-8527-40a2-b1a7-c714bd3509c5-metrics-certs podName:182ef3ca-8527-40a2-b1a7-c714bd3509c5 nodeName:}" failed. No retries permitted until 2026-04-16 18:09:43.523952834 +0000 UTC m=+3.179187666 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/182ef3ca-8527-40a2-b1a7-c714bd3509c5-metrics-certs") pod "network-metrics-daemon-kgtvr" (UID: "182ef3ca-8527-40a2-b1a7-c714bd3509c5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:09:43.024739 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.023978 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/25e589f1-86e1-42cd-a623-02b4361d82ee-tmp-dir\") pod \"node-resolver-b7tsj\" (UID: \"25e589f1-86e1-42cd-a623-02b4361d82ee\") " pod="openshift-dns/node-resolver-b7tsj" Apr 16 18:09:43.024739 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.024002 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/25e589f1-86e1-42cd-a623-02b4361d82ee-hosts-file\") pod \"node-resolver-b7tsj\" (UID: \"25e589f1-86e1-42cd-a623-02b4361d82ee\") " pod="openshift-dns/node-resolver-b7tsj" Apr 16 18:09:43.024739 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.023988 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0fe74f00-c50b-4f93-a926-43b61e8e6182-multus-daemon-config\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:43.024739 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.024073 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-os-release\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:43.024739 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.024013 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-os-release\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:43.024739 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.024119 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-host-var-lib-cni-bin\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:43.024739 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.024146 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n4kz7\" (UniqueName: \"kubernetes.io/projected/25e589f1-86e1-42cd-a623-02b4361d82ee-kube-api-access-n4kz7\") pod \"node-resolver-b7tsj\" (UID: \"25e589f1-86e1-42cd-a623-02b4361d82ee\") " pod="openshift-dns/node-resolver-b7tsj" Apr 16 18:09:43.024739 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.024073 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-hostroot\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:43.024739 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.024244 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-multus-cni-dir\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:43.024739 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.024281 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-multus-conf-dir\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:43.024739 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.024302 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-host-var-lib-cni-bin\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:43.024739 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.024304 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-multus-cni-dir\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:43.024739 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.024350 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-multus-conf-dir\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:43.024739 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.024379 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-multus-socket-dir-parent\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:43.024739 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.024466 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/0fe74f00-c50b-4f93-a926-43b61e8e6182-multus-socket-dir-parent\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:43.024739 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.024511 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0fe74f00-c50b-4f93-a926-43b61e8e6182-cni-binary-copy\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:43.024739 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.024614 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8e7dc651-2c6d-4d3f-912d-c2f49dfd76a0-serviceca\") pod \"node-ca-fd7p9\" (UID: \"8e7dc651-2c6d-4d3f-912d-c2f49dfd76a0\") " pod="openshift-image-registry/node-ca-fd7p9" Apr 16 18:09:43.034502 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.034466 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltttf\" (UniqueName: \"kubernetes.io/projected/182ef3ca-8527-40a2-b1a7-c714bd3509c5-kube-api-access-ltttf\") pod \"network-metrics-daemon-kgtvr\" (UID: \"182ef3ca-8527-40a2-b1a7-c714bd3509c5\") " pod="openshift-multus/network-metrics-daemon-kgtvr" Apr 16 18:09:43.034670 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.034645 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4j5q\" (UniqueName: \"kubernetes.io/projected/0fe74f00-c50b-4f93-a926-43b61e8e6182-kube-api-access-w4j5q\") pod \"multus-t2zvc\" (UID: \"0fe74f00-c50b-4f93-a926-43b61e8e6182\") " pod="openshift-multus/multus-t2zvc" Apr 16 18:09:43.035012 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.034945 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4kz7\" (UniqueName: 
\"kubernetes.io/projected/25e589f1-86e1-42cd-a623-02b4361d82ee-kube-api-access-n4kz7\") pod \"node-resolver-b7tsj\" (UID: \"25e589f1-86e1-42cd-a623-02b4361d82ee\") " pod="openshift-dns/node-resolver-b7tsj" Apr 16 18:09:43.035276 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.035254 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx4v9\" (UniqueName: \"kubernetes.io/projected/8e7dc651-2c6d-4d3f-912d-c2f49dfd76a0-kube-api-access-jx4v9\") pod \"node-ca-fd7p9\" (UID: \"8e7dc651-2c6d-4d3f-912d-c2f49dfd76a0\") " pod="openshift-image-registry/node-ca-fd7p9" Apr 16 18:09:43.108663 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.108619 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:09:43.114117 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.114084 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-54l5n" Apr 16 18:09:43.121056 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.121024 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wt28v" Apr 16 18:09:43.130100 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.130058 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lwp8t" Apr 16 18:09:43.137026 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.136994 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-v5jrt" Apr 16 18:09:43.144736 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.144706 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vntm6" Apr 16 18:09:43.152538 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.152496 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6g7j6" Apr 16 18:09:43.159381 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.159352 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-t2zvc" Apr 16 18:09:43.167177 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.167143 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-b7tsj" Apr 16 18:09:43.171961 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.171927 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fd7p9" Apr 16 18:09:43.427571 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.427537 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/15fcda3c-2ebe-475b-bd0f-7c9f1ed74875-original-pull-secret\") pod \"global-pull-secret-syncer-sx24k\" (UID: \"15fcda3c-2ebe-475b-bd0f-7c9f1ed74875\") " pod="kube-system/global-pull-secret-syncer-sx24k" Apr 16 18:09:43.427749 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:43.427648 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:09:43.427749 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:43.427716 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15fcda3c-2ebe-475b-bd0f-7c9f1ed74875-original-pull-secret podName:15fcda3c-2ebe-475b-bd0f-7c9f1ed74875 nodeName:}" failed. No retries permitted until 2026-04-16 18:09:44.427696845 +0000 UTC m=+4.082931677 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/15fcda3c-2ebe-475b-bd0f-7c9f1ed74875-original-pull-secret") pod "global-pull-secret-syncer-sx24k" (UID: "15fcda3c-2ebe-475b-bd0f-7c9f1ed74875") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:09:43.508910 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:43.508874 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7075ef9f_68cc_485d_bc93_6ebf4ae1fdd9.slice/crio-ee423a68cc4de2989c59e0556128c78c2efe49885807718758201af89000aa0c WatchSource:0}: Error finding container ee423a68cc4de2989c59e0556128c78c2efe49885807718758201af89000aa0c: Status 404 returned error can't find the container with id ee423a68cc4de2989c59e0556128c78c2efe49885807718758201af89000aa0c Apr 16 18:09:43.511629 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:43.511602 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fe74f00_c50b_4f93_a926_43b61e8e6182.slice/crio-76c82df424b622504e12a4ffcd1a2f63a4c505334e508487ef7fdf3c0891b38d WatchSource:0}: Error finding container 76c82df424b622504e12a4ffcd1a2f63a4c505334e508487ef7fdf3c0891b38d: Status 404 returned error can't find the container with id 76c82df424b622504e12a4ffcd1a2f63a4c505334e508487ef7fdf3c0891b38d Apr 16 18:09:43.515733 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:43.515673 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33611e08_8e61_4b6d_ae3b_e6045feb85f0.slice/crio-c73c2e698c3a7c1230c58ebfbcbdafdafdfd7199b61989d71228ab0ecd13a043 WatchSource:0}: Error finding container c73c2e698c3a7c1230c58ebfbcbdafdafdfd7199b61989d71228ab0ecd13a043: Status 404 returned error can't find the container with id c73c2e698c3a7c1230c58ebfbcbdafdafdfd7199b61989d71228ab0ecd13a043 Apr 16 18:09:43.519507 
ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:43.518609 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25e589f1_86e1_42cd_a623_02b4361d82ee.slice/crio-856a13008c2f605eeb689aae9a1cb993ba214951e22dfbd23c210f42d4e448d8 WatchSource:0}: Error finding container 856a13008c2f605eeb689aae9a1cb993ba214951e22dfbd23c210f42d4e448d8: Status 404 returned error can't find the container with id 856a13008c2f605eeb689aae9a1cb993ba214951e22dfbd23c210f42d4e448d8 Apr 16 18:09:43.525189 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:43.525159 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d695f1e_2c44_488a_8185_a74fe3736440.slice/crio-b08bd5f771a3536176b469f4194cb9becd4a83106d2395a60a38dc015de049aa WatchSource:0}: Error finding container b08bd5f771a3536176b469f4194cb9becd4a83106d2395a60a38dc015de049aa: Status 404 returned error can't find the container with id b08bd5f771a3536176b469f4194cb9becd4a83106d2395a60a38dc015de049aa Apr 16 18:09:43.525755 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:43.525715 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2b2c2ec_3dc4_4bfe_9e87_dd6fa7f7edc9.slice/crio-1980fa6d26a0ef73b2d041cd9f687e09372114dba6e8cf64bc6779e6fb2108bf WatchSource:0}: Error finding container 1980fa6d26a0ef73b2d041cd9f687e09372114dba6e8cf64bc6779e6fb2108bf: Status 404 returned error can't find the container with id 1980fa6d26a0ef73b2d041cd9f687e09372114dba6e8cf64bc6779e6fb2108bf Apr 16 18:09:43.526518 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:43.526490 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e7dc651_2c6d_4d3f_912d_c2f49dfd76a0.slice/crio-168da12be05c8584cba7ed60dc6e6803c90783f60a65ad2c321614fe6f344690 WatchSource:0}: Error 
finding container 168da12be05c8584cba7ed60dc6e6803c90783f60a65ad2c321614fe6f344690: Status 404 returned error can't find the container with id 168da12be05c8584cba7ed60dc6e6803c90783f60a65ad2c321614fe6f344690 Apr 16 18:09:43.527528 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:09:43.527373 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7dc3386_c234_45e0_91df_85d871e4cbbd.slice/crio-83bbbcf470e70c7ee469f76a531542f37edb6f66158ebb1d72d03f7d2efb2e2a WatchSource:0}: Error finding container 83bbbcf470e70c7ee469f76a531542f37edb6f66158ebb1d72d03f7d2efb2e2a: Status 404 returned error can't find the container with id 83bbbcf470e70c7ee469f76a531542f37edb6f66158ebb1d72d03f7d2efb2e2a Apr 16 18:09:43.528001 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.527971 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-946x6\" (UniqueName: \"kubernetes.io/projected/4b2b1c61-3440-4f11-9320-0d47781218e5-kube-api-access-946x6\") pod \"network-check-target-fswkr\" (UID: \"4b2b1c61-3440-4f11-9320-0d47781218e5\") " pod="openshift-network-diagnostics/network-check-target-fswkr" Apr 16 18:09:43.528090 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.528040 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/182ef3ca-8527-40a2-b1a7-c714bd3509c5-metrics-certs\") pod \"network-metrics-daemon-kgtvr\" (UID: \"182ef3ca-8527-40a2-b1a7-c714bd3509c5\") " pod="openshift-multus/network-metrics-daemon-kgtvr" Apr 16 18:09:43.528314 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:43.528135 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:09:43.528314 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:43.528156 2574 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:09:43.528314 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:43.528162 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:09:43.528314 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:43.528171 2574 projected.go:194] Error preparing data for projected volume kube-api-access-946x6 for pod openshift-network-diagnostics/network-check-target-fswkr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:09:43.528314 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:43.528201 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/182ef3ca-8527-40a2-b1a7-c714bd3509c5-metrics-certs podName:182ef3ca-8527-40a2-b1a7-c714bd3509c5 nodeName:}" failed. No retries permitted until 2026-04-16 18:09:44.528187366 +0000 UTC m=+4.183422197 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/182ef3ca-8527-40a2-b1a7-c714bd3509c5-metrics-certs") pod "network-metrics-daemon-kgtvr" (UID: "182ef3ca-8527-40a2-b1a7-c714bd3509c5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:09:43.528314 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:43.528215 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b2b1c61-3440-4f11-9320-0d47781218e5-kube-api-access-946x6 podName:4b2b1c61-3440-4f11-9320-0d47781218e5 nodeName:}" failed. No retries permitted until 2026-04-16 18:09:44.528208647 +0000 UTC m=+4.183443478 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-946x6" (UniqueName: "kubernetes.io/projected/4b2b1c61-3440-4f11-9320-0d47781218e5-kube-api-access-946x6") pod "network-check-target-fswkr" (UID: "4b2b1c61-3440-4f11-9320-0d47781218e5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:09:43.853138 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.852673 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:04:41 +0000 UTC" deadline="2027-11-10 14:58:53.223742081 +0000 UTC" Apr 16 18:09:43.853679 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.853171 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13748h49m9.370578831s" Apr 16 18:09:43.933813 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.933273 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgtvr" Apr 16 18:09:43.933813 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:43.933435 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kgtvr" podUID="182ef3ca-8527-40a2-b1a7-c714bd3509c5" Apr 16 18:09:43.943289 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.943229 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-t2zvc" event={"ID":"0fe74f00-c50b-4f93-a926-43b61e8e6182","Type":"ContainerStarted","Data":"76c82df424b622504e12a4ffcd1a2f63a4c505334e508487ef7fdf3c0891b38d"} Apr 16 18:09:43.945015 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.944954 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-v5jrt" event={"ID":"7075ef9f-68cc-485d-bc93-6ebf4ae1fdd9","Type":"ContainerStarted","Data":"ee423a68cc4de2989c59e0556128c78c2efe49885807718758201af89000aa0c"} Apr 16 18:09:43.953494 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.953448 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-43.ec2.internal" event={"ID":"d2abeffdf9790c6ff1185a908891d8f2","Type":"ContainerStarted","Data":"e049fb847a63c112a829bd3050f8c792e41b26443ae5f6427189d03337b1cdd0"} Apr 16 18:09:43.956282 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.956243 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-54l5n" event={"ID":"c7dc3386-c234-45e0-91df-85d871e4cbbd","Type":"ContainerStarted","Data":"83bbbcf470e70c7ee469f76a531542f37edb6f66158ebb1d72d03f7d2efb2e2a"} Apr 16 18:09:43.960845 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.960776 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lwp8t" event={"ID":"a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9","Type":"ContainerStarted","Data":"1980fa6d26a0ef73b2d041cd9f687e09372114dba6e8cf64bc6779e6fb2108bf"} Apr 16 18:09:43.965623 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.965591 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wt28v" 
event={"ID":"d40f8597-4c6c-46ba-9f26-4cea171429a6","Type":"ContainerStarted","Data":"c179c47ad31ffc290bb352269f6fad8932d6eda66fd046a78588caf7fd1c8a96"} Apr 16 18:09:43.968373 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.968310 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-43.ec2.internal" podStartSLOduration=1.9682931959999999 podStartE2EDuration="1.968293196s" podCreationTimestamp="2026-04-16 18:09:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:09:43.967986422 +0000 UTC m=+3.623221280" watchObservedRunningTime="2026-04-16 18:09:43.968293196 +0000 UTC m=+3.623528052" Apr 16 18:09:43.969507 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.969479 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-b7tsj" event={"ID":"25e589f1-86e1-42cd-a623-02b4361d82ee","Type":"ContainerStarted","Data":"856a13008c2f605eeb689aae9a1cb993ba214951e22dfbd23c210f42d4e448d8"} Apr 16 18:09:43.972789 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.972757 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6g7j6" event={"ID":"33611e08-8e61-4b6d-ae3b-e6045feb85f0","Type":"ContainerStarted","Data":"c73c2e698c3a7c1230c58ebfbcbdafdafdfd7199b61989d71228ab0ecd13a043"} Apr 16 18:09:43.976810 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.976779 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fd7p9" event={"ID":"8e7dc651-2c6d-4d3f-912d-c2f49dfd76a0","Type":"ContainerStarted","Data":"168da12be05c8584cba7ed60dc6e6803c90783f60a65ad2c321614fe6f344690"} Apr 16 18:09:43.981423 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:43.981365 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vntm6" 
event={"ID":"0d695f1e-2c44-488a-8185-a74fe3736440","Type":"ContainerStarted","Data":"b08bd5f771a3536176b469f4194cb9becd4a83106d2395a60a38dc015de049aa"} Apr 16 18:09:44.436787 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:44.436749 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/15fcda3c-2ebe-475b-bd0f-7c9f1ed74875-original-pull-secret\") pod \"global-pull-secret-syncer-sx24k\" (UID: \"15fcda3c-2ebe-475b-bd0f-7c9f1ed74875\") " pod="kube-system/global-pull-secret-syncer-sx24k" Apr 16 18:09:44.436982 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:44.436964 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:09:44.437048 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:44.437037 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15fcda3c-2ebe-475b-bd0f-7c9f1ed74875-original-pull-secret podName:15fcda3c-2ebe-475b-bd0f-7c9f1ed74875 nodeName:}" failed. No retries permitted until 2026-04-16 18:09:46.437016887 +0000 UTC m=+6.092251731 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/15fcda3c-2ebe-475b-bd0f-7c9f1ed74875-original-pull-secret") pod "global-pull-secret-syncer-sx24k" (UID: "15fcda3c-2ebe-475b-bd0f-7c9f1ed74875") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:09:44.537255 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:44.537213 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-946x6\" (UniqueName: \"kubernetes.io/projected/4b2b1c61-3440-4f11-9320-0d47781218e5-kube-api-access-946x6\") pod \"network-check-target-fswkr\" (UID: \"4b2b1c61-3440-4f11-9320-0d47781218e5\") " pod="openshift-network-diagnostics/network-check-target-fswkr" Apr 16 18:09:44.537415 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:44.537280 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/182ef3ca-8527-40a2-b1a7-c714bd3509c5-metrics-certs\") pod \"network-metrics-daemon-kgtvr\" (UID: \"182ef3ca-8527-40a2-b1a7-c714bd3509c5\") " pod="openshift-multus/network-metrics-daemon-kgtvr" Apr 16 18:09:44.537415 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:44.537405 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:09:44.537523 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:44.537470 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/182ef3ca-8527-40a2-b1a7-c714bd3509c5-metrics-certs podName:182ef3ca-8527-40a2-b1a7-c714bd3509c5 nodeName:}" failed. No retries permitted until 2026-04-16 18:09:46.537449768 +0000 UTC m=+6.192684615 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/182ef3ca-8527-40a2-b1a7-c714bd3509c5-metrics-certs") pod "network-metrics-daemon-kgtvr" (UID: "182ef3ca-8527-40a2-b1a7-c714bd3509c5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:09:44.537944 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:44.537924 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:09:44.537944 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:44.537946 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:09:44.538128 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:44.537958 2574 projected.go:194] Error preparing data for projected volume kube-api-access-946x6 for pod openshift-network-diagnostics/network-check-target-fswkr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:09:44.538128 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:44.538016 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b2b1c61-3440-4f11-9320-0d47781218e5-kube-api-access-946x6 podName:4b2b1c61-3440-4f11-9320-0d47781218e5 nodeName:}" failed. No retries permitted until 2026-04-16 18:09:46.537987019 +0000 UTC m=+6.193221850 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-946x6" (UniqueName: "kubernetes.io/projected/4b2b1c61-3440-4f11-9320-0d47781218e5-kube-api-access-946x6") pod "network-check-target-fswkr" (UID: "4b2b1c61-3440-4f11-9320-0d47781218e5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:09:44.933761 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:44.933121 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fswkr" Apr 16 18:09:44.933761 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:44.933264 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fswkr" podUID="4b2b1c61-3440-4f11-9320-0d47781218e5" Apr 16 18:09:44.933761 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:44.933127 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sx24k" Apr 16 18:09:44.933761 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:44.933371 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-sx24k" podUID="15fcda3c-2ebe-475b-bd0f-7c9f1ed74875" Apr 16 18:09:44.998513 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:44.998473 2574 generic.go:358] "Generic (PLEG): container finished" podID="c0563976be90271e20ad81aea226de1a" containerID="ac0d655a32abb861239aac6a13dd2f4edbf00abb4ba9acbeeaae6956fe0bafe3" exitCode=0 Apr 16 18:09:44.999460 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:44.999427 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-43.ec2.internal" event={"ID":"c0563976be90271e20ad81aea226de1a","Type":"ContainerDied","Data":"ac0d655a32abb861239aac6a13dd2f4edbf00abb4ba9acbeeaae6956fe0bafe3"} Apr 16 18:09:45.933950 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:45.933355 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgtvr" Apr 16 18:09:45.933950 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:45.933521 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kgtvr" podUID="182ef3ca-8527-40a2-b1a7-c714bd3509c5" Apr 16 18:09:46.010108 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:46.010064 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-43.ec2.internal" event={"ID":"c0563976be90271e20ad81aea226de1a","Type":"ContainerStarted","Data":"76f886d38c24f1ed18e3109419356cf2ccdd2518c4dc5cd6ad7cfa5375111e3b"} Apr 16 18:09:46.453205 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:46.453152 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/15fcda3c-2ebe-475b-bd0f-7c9f1ed74875-original-pull-secret\") pod \"global-pull-secret-syncer-sx24k\" (UID: \"15fcda3c-2ebe-475b-bd0f-7c9f1ed74875\") " pod="kube-system/global-pull-secret-syncer-sx24k" Apr 16 18:09:46.453405 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:46.453306 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:09:46.453405 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:46.453372 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15fcda3c-2ebe-475b-bd0f-7c9f1ed74875-original-pull-secret podName:15fcda3c-2ebe-475b-bd0f-7c9f1ed74875 nodeName:}" failed. No retries permitted until 2026-04-16 18:09:50.453354057 +0000 UTC m=+10.108588906 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/15fcda3c-2ebe-475b-bd0f-7c9f1ed74875-original-pull-secret") pod "global-pull-secret-syncer-sx24k" (UID: "15fcda3c-2ebe-475b-bd0f-7c9f1ed74875") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:09:46.553748 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:46.553708 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-946x6\" (UniqueName: \"kubernetes.io/projected/4b2b1c61-3440-4f11-9320-0d47781218e5-kube-api-access-946x6\") pod \"network-check-target-fswkr\" (UID: \"4b2b1c61-3440-4f11-9320-0d47781218e5\") " pod="openshift-network-diagnostics/network-check-target-fswkr" Apr 16 18:09:46.553956 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:46.553777 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/182ef3ca-8527-40a2-b1a7-c714bd3509c5-metrics-certs\") pod \"network-metrics-daemon-kgtvr\" (UID: \"182ef3ca-8527-40a2-b1a7-c714bd3509c5\") " pod="openshift-multus/network-metrics-daemon-kgtvr" Apr 16 18:09:46.553956 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:46.553922 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:09:46.554074 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:46.553989 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/182ef3ca-8527-40a2-b1a7-c714bd3509c5-metrics-certs podName:182ef3ca-8527-40a2-b1a7-c714bd3509c5 nodeName:}" failed. No retries permitted until 2026-04-16 18:09:50.553968024 +0000 UTC m=+10.209202856 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/182ef3ca-8527-40a2-b1a7-c714bd3509c5-metrics-certs") pod "network-metrics-daemon-kgtvr" (UID: "182ef3ca-8527-40a2-b1a7-c714bd3509c5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:09:46.554466 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:46.554443 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:09:46.554466 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:46.554469 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:09:46.554653 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:46.554482 2574 projected.go:194] Error preparing data for projected volume kube-api-access-946x6 for pod openshift-network-diagnostics/network-check-target-fswkr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:09:46.554653 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:46.554528 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b2b1c61-3440-4f11-9320-0d47781218e5-kube-api-access-946x6 podName:4b2b1c61-3440-4f11-9320-0d47781218e5 nodeName:}" failed. No retries permitted until 2026-04-16 18:09:50.554511886 +0000 UTC m=+10.209746717 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-946x6" (UniqueName: "kubernetes.io/projected/4b2b1c61-3440-4f11-9320-0d47781218e5-kube-api-access-946x6") pod "network-check-target-fswkr" (UID: "4b2b1c61-3440-4f11-9320-0d47781218e5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:09:46.933840 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:46.933759 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sx24k" Apr 16 18:09:46.934038 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:46.933910 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sx24k" podUID="15fcda3c-2ebe-475b-bd0f-7c9f1ed74875" Apr 16 18:09:46.934888 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:46.934708 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fswkr" Apr 16 18:09:46.934888 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:46.934841 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fswkr" podUID="4b2b1c61-3440-4f11-9320-0d47781218e5" Apr 16 18:09:47.933491 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:47.933452 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgtvr" Apr 16 18:09:47.933688 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:47.933610 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kgtvr" podUID="182ef3ca-8527-40a2-b1a7-c714bd3509c5" Apr 16 18:09:48.933444 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:48.933410 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sx24k" Apr 16 18:09:48.933955 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:48.933422 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fswkr" Apr 16 18:09:48.933955 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:48.933544 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sx24k" podUID="15fcda3c-2ebe-475b-bd0f-7c9f1ed74875" Apr 16 18:09:48.933955 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:48.933661 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-fswkr" podUID="4b2b1c61-3440-4f11-9320-0d47781218e5" Apr 16 18:09:49.933305 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:49.933267 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgtvr" Apr 16 18:09:49.933503 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:49.933419 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kgtvr" podUID="182ef3ca-8527-40a2-b1a7-c714bd3509c5" Apr 16 18:09:50.487798 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:50.487691 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:09:50.487798 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:50.487786 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15fcda3c-2ebe-475b-bd0f-7c9f1ed74875-original-pull-secret podName:15fcda3c-2ebe-475b-bd0f-7c9f1ed74875 nodeName:}" failed. No retries permitted until 2026-04-16 18:09:58.487759412 +0000 UTC m=+18.142994266 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/15fcda3c-2ebe-475b-bd0f-7c9f1ed74875-original-pull-secret") pod "global-pull-secret-syncer-sx24k" (UID: "15fcda3c-2ebe-475b-bd0f-7c9f1ed74875") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:09:50.490473 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:50.488357 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/15fcda3c-2ebe-475b-bd0f-7c9f1ed74875-original-pull-secret\") pod \"global-pull-secret-syncer-sx24k\" (UID: \"15fcda3c-2ebe-475b-bd0f-7c9f1ed74875\") " pod="kube-system/global-pull-secret-syncer-sx24k" Apr 16 18:09:50.589727 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:50.589679 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/182ef3ca-8527-40a2-b1a7-c714bd3509c5-metrics-certs\") pod \"network-metrics-daemon-kgtvr\" (UID: \"182ef3ca-8527-40a2-b1a7-c714bd3509c5\") " pod="openshift-multus/network-metrics-daemon-kgtvr" Apr 16 18:09:50.589945 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:50.589785 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-946x6\" (UniqueName: \"kubernetes.io/projected/4b2b1c61-3440-4f11-9320-0d47781218e5-kube-api-access-946x6\") pod \"network-check-target-fswkr\" (UID: \"4b2b1c61-3440-4f11-9320-0d47781218e5\") " pod="openshift-network-diagnostics/network-check-target-fswkr" Apr 16 18:09:50.589945 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:50.589940 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:09:50.590057 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:50.589959 2574 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:09:50.590057 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:50.589972 2574 projected.go:194] Error preparing data for projected volume kube-api-access-946x6 for pod openshift-network-diagnostics/network-check-target-fswkr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:09:50.590057 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:50.590034 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b2b1c61-3440-4f11-9320-0d47781218e5-kube-api-access-946x6 podName:4b2b1c61-3440-4f11-9320-0d47781218e5 nodeName:}" failed. No retries permitted until 2026-04-16 18:09:58.590014455 +0000 UTC m=+18.245249301 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-946x6" (UniqueName: "kubernetes.io/projected/4b2b1c61-3440-4f11-9320-0d47781218e5-kube-api-access-946x6") pod "network-check-target-fswkr" (UID: "4b2b1c61-3440-4f11-9320-0d47781218e5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:09:50.590539 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:50.590457 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:09:50.590539 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:50.590506 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/182ef3ca-8527-40a2-b1a7-c714bd3509c5-metrics-certs podName:182ef3ca-8527-40a2-b1a7-c714bd3509c5 nodeName:}" failed. No retries permitted until 2026-04-16 18:09:58.590491902 +0000 UTC m=+18.245726736 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/182ef3ca-8527-40a2-b1a7-c714bd3509c5-metrics-certs") pod "network-metrics-daemon-kgtvr" (UID: "182ef3ca-8527-40a2-b1a7-c714bd3509c5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:09:50.934987 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:50.934950 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fswkr" Apr 16 18:09:50.935465 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:50.935090 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fswkr" podUID="4b2b1c61-3440-4f11-9320-0d47781218e5" Apr 16 18:09:50.935465 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:50.935150 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sx24k" Apr 16 18:09:50.935465 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:50.935262 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sx24k" podUID="15fcda3c-2ebe-475b-bd0f-7c9f1ed74875" Apr 16 18:09:51.933866 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:51.933815 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgtvr" Apr 16 18:09:51.934167 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:51.933977 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kgtvr" podUID="182ef3ca-8527-40a2-b1a7-c714bd3509c5" Apr 16 18:09:52.933184 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:52.933149 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sx24k" Apr 16 18:09:52.933628 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:52.933283 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sx24k" podUID="15fcda3c-2ebe-475b-bd0f-7c9f1ed74875" Apr 16 18:09:52.933628 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:52.933343 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fswkr" Apr 16 18:09:52.933628 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:52.933451 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-fswkr" podUID="4b2b1c61-3440-4f11-9320-0d47781218e5" Apr 16 18:09:53.933207 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:53.933158 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgtvr" Apr 16 18:09:53.933683 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:53.933314 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kgtvr" podUID="182ef3ca-8527-40a2-b1a7-c714bd3509c5" Apr 16 18:09:54.933482 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:54.933446 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fswkr" Apr 16 18:09:54.933920 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:54.933564 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fswkr" podUID="4b2b1c61-3440-4f11-9320-0d47781218e5" Apr 16 18:09:54.933920 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:54.933624 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-sx24k" Apr 16 18:09:54.933920 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:54.933737 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sx24k" podUID="15fcda3c-2ebe-475b-bd0f-7c9f1ed74875" Apr 16 18:09:55.933937 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:55.933896 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgtvr" Apr 16 18:09:55.934366 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:55.934025 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kgtvr" podUID="182ef3ca-8527-40a2-b1a7-c714bd3509c5" Apr 16 18:09:56.934071 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:56.934031 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fswkr" Apr 16 18:09:56.934541 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:56.934076 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-sx24k" Apr 16 18:09:56.934541 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:56.934186 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fswkr" podUID="4b2b1c61-3440-4f11-9320-0d47781218e5" Apr 16 18:09:56.934541 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:56.934272 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sx24k" podUID="15fcda3c-2ebe-475b-bd0f-7c9f1ed74875" Apr 16 18:09:57.933769 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:57.933738 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgtvr" Apr 16 18:09:57.933966 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:57.933867 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kgtvr" podUID="182ef3ca-8527-40a2-b1a7-c714bd3509c5" Apr 16 18:09:58.549914 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:58.549873 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/15fcda3c-2ebe-475b-bd0f-7c9f1ed74875-original-pull-secret\") pod \"global-pull-secret-syncer-sx24k\" (UID: \"15fcda3c-2ebe-475b-bd0f-7c9f1ed74875\") " pod="kube-system/global-pull-secret-syncer-sx24k" Apr 16 18:09:58.550383 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:58.550004 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:09:58.550383 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:58.550065 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15fcda3c-2ebe-475b-bd0f-7c9f1ed74875-original-pull-secret podName:15fcda3c-2ebe-475b-bd0f-7c9f1ed74875 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:14.5500517 +0000 UTC m=+34.205286531 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/15fcda3c-2ebe-475b-bd0f-7c9f1ed74875-original-pull-secret") pod "global-pull-secret-syncer-sx24k" (UID: "15fcda3c-2ebe-475b-bd0f-7c9f1ed74875") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:09:58.650259 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:58.650218 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-946x6\" (UniqueName: \"kubernetes.io/projected/4b2b1c61-3440-4f11-9320-0d47781218e5-kube-api-access-946x6\") pod \"network-check-target-fswkr\" (UID: \"4b2b1c61-3440-4f11-9320-0d47781218e5\") " pod="openshift-network-diagnostics/network-check-target-fswkr" Apr 16 18:09:58.650259 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:58.650271 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/182ef3ca-8527-40a2-b1a7-c714bd3509c5-metrics-certs\") pod \"network-metrics-daemon-kgtvr\" (UID: \"182ef3ca-8527-40a2-b1a7-c714bd3509c5\") " pod="openshift-multus/network-metrics-daemon-kgtvr" Apr 16 18:09:58.650484 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:58.650384 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:09:58.650484 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:58.650411 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:09:58.650484 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:58.650434 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:09:58.650484 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:58.650448 2574 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/182ef3ca-8527-40a2-b1a7-c714bd3509c5-metrics-certs podName:182ef3ca-8527-40a2-b1a7-c714bd3509c5 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:14.650428575 +0000 UTC m=+34.305663424 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/182ef3ca-8527-40a2-b1a7-c714bd3509c5-metrics-certs") pod "network-metrics-daemon-kgtvr" (UID: "182ef3ca-8527-40a2-b1a7-c714bd3509c5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:09:58.650484 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:58.650450 2574 projected.go:194] Error preparing data for projected volume kube-api-access-946x6 for pod openshift-network-diagnostics/network-check-target-fswkr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:09:58.650484 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:58.650488 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b2b1c61-3440-4f11-9320-0d47781218e5-kube-api-access-946x6 podName:4b2b1c61-3440-4f11-9320-0d47781218e5 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:14.650481058 +0000 UTC m=+34.305715890 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-946x6" (UniqueName: "kubernetes.io/projected/4b2b1c61-3440-4f11-9320-0d47781218e5-kube-api-access-946x6") pod "network-check-target-fswkr" (UID: "4b2b1c61-3440-4f11-9320-0d47781218e5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:09:58.933695 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:58.933659 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fswkr" Apr 16 18:09:58.933880 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:58.933659 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sx24k" Apr 16 18:09:58.933880 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:58.933790 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fswkr" podUID="4b2b1c61-3440-4f11-9320-0d47781218e5" Apr 16 18:09:58.934010 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:58.933874 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sx24k" podUID="15fcda3c-2ebe-475b-bd0f-7c9f1ed74875" Apr 16 18:09:59.933598 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:09:59.933563 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgtvr" Apr 16 18:09:59.934105 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:09:59.933691 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kgtvr" podUID="182ef3ca-8527-40a2-b1a7-c714bd3509c5" Apr 16 18:10:00.936701 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:00.936402 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sx24k" Apr 16 18:10:00.937355 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:00.936402 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fswkr" Apr 16 18:10:00.937355 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:00.936781 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sx24k" podUID="15fcda3c-2ebe-475b-bd0f-7c9f1ed74875" Apr 16 18:10:00.937355 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:00.936884 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-fswkr" podUID="4b2b1c61-3440-4f11-9320-0d47781218e5" Apr 16 18:10:01.047820 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:01.047786 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fd7p9" event={"ID":"8e7dc651-2c6d-4d3f-912d-c2f49dfd76a0","Type":"ContainerStarted","Data":"e0098bca7239c9e3fb2e19c759b4678b924acda7ad2308c2c5ae7666a35ef456"} Apr 16 18:10:01.049190 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:01.049165 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vntm6" event={"ID":"0d695f1e-2c44-488a-8185-a74fe3736440","Type":"ContainerStarted","Data":"7e75a103510b9726d3271c883fb3ecf6abbea14fb4780fba03e21ee006da5fef"} Apr 16 18:10:01.050522 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:01.050492 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-t2zvc" event={"ID":"0fe74f00-c50b-4f93-a926-43b61e8e6182","Type":"ContainerStarted","Data":"58441624ea592be54ed2235237a68482be977f1a50201f64f8d108ffa9c7f497"} Apr 16 18:10:01.052037 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:01.051973 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-v5jrt" event={"ID":"7075ef9f-68cc-485d-bc93-6ebf4ae1fdd9","Type":"ContainerStarted","Data":"a9ff3601d3ea42476c6652360a1a17efd72ae63bbb24167fa9cf9967f95e4e11"} Apr 16 18:10:01.053329 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:01.053305 2574 generic.go:358] "Generic (PLEG): container finished" podID="a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9" containerID="de04981c9257111d1f23f96524916d180055deaa2c5130e4ba88c7b543f9784e" exitCode=0 Apr 16 18:10:01.053435 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:01.053369 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lwp8t" 
event={"ID":"a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9","Type":"ContainerDied","Data":"de04981c9257111d1f23f96524916d180055deaa2c5130e4ba88c7b543f9784e"} Apr 16 18:10:01.055336 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:01.055305 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wt28v_d40f8597-4c6c-46ba-9f26-4cea171429a6/ovn-acl-logging/0.log" Apr 16 18:10:01.055734 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:01.055706 2574 generic.go:358] "Generic (PLEG): container finished" podID="d40f8597-4c6c-46ba-9f26-4cea171429a6" containerID="9b2167e4958e72ceb7ea95a6ac9809b4cb5a33562f4c1f51358df1a13833b37d" exitCode=1 Apr 16 18:10:01.055811 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:01.055780 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wt28v" event={"ID":"d40f8597-4c6c-46ba-9f26-4cea171429a6","Type":"ContainerStarted","Data":"002b85b69480d038bcb9a8b94464d065d99a65d47c70a3e1c83270c88fbf3b48"} Apr 16 18:10:01.055878 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:01.055818 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wt28v" event={"ID":"d40f8597-4c6c-46ba-9f26-4cea171429a6","Type":"ContainerDied","Data":"9b2167e4958e72ceb7ea95a6ac9809b4cb5a33562f4c1f51358df1a13833b37d"} Apr 16 18:10:01.055878 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:01.055853 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wt28v" event={"ID":"d40f8597-4c6c-46ba-9f26-4cea171429a6","Type":"ContainerStarted","Data":"5c214de8f08640f8b1a060e324195556be703065e21102571395be0ac592ceac"} Apr 16 18:10:01.057225 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:01.057191 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-b7tsj" 
event={"ID":"25e589f1-86e1-42cd-a623-02b4361d82ee","Type":"ContainerStarted","Data":"e51aa082ec0380f40b56054a1fbb372e29d4adde5db2f12d9e8e1c7e11fead95"} Apr 16 18:10:01.058564 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:01.058531 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6g7j6" event={"ID":"33611e08-8e61-4b6d-ae3b-e6045feb85f0","Type":"ContainerStarted","Data":"5c7a1fff3a6802ed348fdb30470d9cb5ac0feb1c5576d1411274a250ad91215b"} Apr 16 18:10:01.064152 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:01.064109 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-fd7p9" podStartSLOduration=3.220991392 podStartE2EDuration="20.064094962s" podCreationTimestamp="2026-04-16 18:09:41 +0000 UTC" firstStartedPulling="2026-04-16 18:09:43.529677347 +0000 UTC m=+3.184912179" lastFinishedPulling="2026-04-16 18:10:00.372780905 +0000 UTC m=+20.028015749" observedRunningTime="2026-04-16 18:10:01.063959015 +0000 UTC m=+20.719193887" watchObservedRunningTime="2026-04-16 18:10:01.064094962 +0000 UTC m=+20.719329816" Apr 16 18:10:01.064695 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:01.064667 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-43.ec2.internal" podStartSLOduration=19.064657972 podStartE2EDuration="19.064657972s" podCreationTimestamp="2026-04-16 18:09:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:09:46.026204132 +0000 UTC m=+5.681438986" watchObservedRunningTime="2026-04-16 18:10:01.064657972 +0000 UTC m=+20.719892825" Apr 16 18:10:01.102956 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:01.102911 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-b7tsj" podStartSLOduration=3.2520540589999998 
podStartE2EDuration="20.102896135s" podCreationTimestamp="2026-04-16 18:09:41 +0000 UTC" firstStartedPulling="2026-04-16 18:09:43.521868772 +0000 UTC m=+3.177103604" lastFinishedPulling="2026-04-16 18:10:00.372710844 +0000 UTC m=+20.027945680" observedRunningTime="2026-04-16 18:10:01.08017866 +0000 UTC m=+20.735413532" watchObservedRunningTime="2026-04-16 18:10:01.102896135 +0000 UTC m=+20.758130985" Apr 16 18:10:01.119458 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:01.119411 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-v5jrt" podStartSLOduration=3.258064991 podStartE2EDuration="20.119399089s" podCreationTimestamp="2026-04-16 18:09:41 +0000 UTC" firstStartedPulling="2026-04-16 18:09:43.511530533 +0000 UTC m=+3.166765364" lastFinishedPulling="2026-04-16 18:10:00.372864614 +0000 UTC m=+20.028099462" observedRunningTime="2026-04-16 18:10:01.119032373 +0000 UTC m=+20.774267227" watchObservedRunningTime="2026-04-16 18:10:01.119399089 +0000 UTC m=+20.774633969" Apr 16 18:10:01.140168 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:01.138952 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-t2zvc" podStartSLOduration=3.243184632 podStartE2EDuration="20.13893277s" podCreationTimestamp="2026-04-16 18:09:41 +0000 UTC" firstStartedPulling="2026-04-16 18:09:43.513875663 +0000 UTC m=+3.169110494" lastFinishedPulling="2026-04-16 18:10:00.409623791 +0000 UTC m=+20.064858632" observedRunningTime="2026-04-16 18:10:01.138242046 +0000 UTC m=+20.793476901" watchObservedRunningTime="2026-04-16 18:10:01.13893277 +0000 UTC m=+20.794167624" Apr 16 18:10:01.155841 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:01.155773 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-6g7j6" podStartSLOduration=3.301597671 podStartE2EDuration="20.155757946s" podCreationTimestamp="2026-04-16 18:09:41 +0000 UTC" 
firstStartedPulling="2026-04-16 18:09:43.520413719 +0000 UTC m=+3.175648567" lastFinishedPulling="2026-04-16 18:10:00.374573997 +0000 UTC m=+20.029808842" observedRunningTime="2026-04-16 18:10:01.155457648 +0000 UTC m=+20.810692520" watchObservedRunningTime="2026-04-16 18:10:01.155757946 +0000 UTC m=+20.810992799" Apr 16 18:10:01.933121 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:01.933083 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgtvr" Apr 16 18:10:01.933313 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:01.933227 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kgtvr" podUID="182ef3ca-8527-40a2-b1a7-c714bd3509c5" Apr 16 18:10:02.062464 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:02.062430 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-54l5n" event={"ID":"c7dc3386-c234-45e0-91df-85d871e4cbbd","Type":"ContainerStarted","Data":"746956d439d5f9b555ab898a9c9b12332c800d8b67dc9beeca6479d608540a2d"} Apr 16 18:10:02.065521 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:02.065442 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wt28v_d40f8597-4c6c-46ba-9f26-4cea171429a6/ovn-acl-logging/0.log" Apr 16 18:10:02.065931 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:02.065894 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wt28v" event={"ID":"d40f8597-4c6c-46ba-9f26-4cea171429a6","Type":"ContainerStarted","Data":"94be19bfae1d32979ad518afa58c076adac4fd466639e8f6b69dd69596b4d087"} Apr 16 18:10:02.066042 ip-10-0-142-43 kubenswrapper[2574]: I0416 
18:10:02.065934 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wt28v" event={"ID":"d40f8597-4c6c-46ba-9f26-4cea171429a6","Type":"ContainerStarted","Data":"91c24cd803171ed72cabb91d3ec55fe12d62004a7d9678e766c8844c28daca27"} Apr 16 18:10:02.066042 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:02.065959 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wt28v" event={"ID":"d40f8597-4c6c-46ba-9f26-4cea171429a6","Type":"ContainerStarted","Data":"558a6cff3118c77996647c82d9d0737046ec1dae695440e89c9bdbd7ad0048e8"} Apr 16 18:10:02.080885 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:02.080814 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-54l5n" podStartSLOduration=5.237316795 podStartE2EDuration="22.080792198s" podCreationTimestamp="2026-04-16 18:09:40 +0000 UTC" firstStartedPulling="2026-04-16 18:09:43.529261345 +0000 UTC m=+3.184496176" lastFinishedPulling="2026-04-16 18:10:00.372736738 +0000 UTC m=+20.027971579" observedRunningTime="2026-04-16 18:10:02.079930747 +0000 UTC m=+21.735165600" watchObservedRunningTime="2026-04-16 18:10:02.080792198 +0000 UTC m=+21.736027123" Apr 16 18:10:02.259116 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:02.259086 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 18:10:02.894354 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:02.894227 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T18:10:02.259109083Z","UUID":"a0574414-d452-437c-8f72-1f24427502e2","Handler":null,"Name":"","Endpoint":""} Apr 16 18:10:02.897485 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:02.897456 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to 
validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 18:10:02.897485 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:02.897486 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 18:10:02.937089 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:02.937062 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sx24k" Apr 16 18:10:02.937250 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:02.937190 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sx24k" podUID="15fcda3c-2ebe-475b-bd0f-7c9f1ed74875" Apr 16 18:10:02.937250 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:02.937062 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fswkr" Apr 16 18:10:02.937385 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:02.937361 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-fswkr" podUID="4b2b1c61-3440-4f11-9320-0d47781218e5" Apr 16 18:10:03.067752 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:03.067719 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-v5jrt" Apr 16 18:10:03.068494 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:03.068471 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-v5jrt" Apr 16 18:10:03.070059 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:03.070010 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vntm6" event={"ID":"0d695f1e-2c44-488a-8185-a74fe3736440","Type":"ContainerStarted","Data":"6595bbd4a4ec90cef8e620a75349f4c93464762c7da123a7874dd9b78bcbb4ef"} Apr 16 18:10:03.933564 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:03.933532 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgtvr" Apr 16 18:10:03.933748 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:03.933664 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kgtvr" podUID="182ef3ca-8527-40a2-b1a7-c714bd3509c5" Apr 16 18:10:04.074481 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:04.074437 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vntm6" event={"ID":"0d695f1e-2c44-488a-8185-a74fe3736440","Type":"ContainerStarted","Data":"80070b849e561c169d36f33e7c272abde36b56db207067d7338a7ea2b3d7eb3a"} Apr 16 18:10:04.077472 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:04.077444 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wt28v_d40f8597-4c6c-46ba-9f26-4cea171429a6/ovn-acl-logging/0.log" Apr 16 18:10:04.077772 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:04.077744 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wt28v" event={"ID":"d40f8597-4c6c-46ba-9f26-4cea171429a6","Type":"ContainerStarted","Data":"a1d58341cac439530a3c76d3f0c8336757f616ede2d42499fc33cfa21b4b68c6"} Apr 16 18:10:04.077855 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:04.077812 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 18:10:04.094878 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:04.094806 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vntm6" podStartSLOduration=3.148192084 podStartE2EDuration="23.094788825s" podCreationTimestamp="2026-04-16 18:09:41 +0000 UTC" firstStartedPulling="2026-04-16 18:09:43.527161854 +0000 UTC m=+3.182396691" lastFinishedPulling="2026-04-16 18:10:03.473758586 +0000 UTC m=+23.128993432" observedRunningTime="2026-04-16 18:10:04.092768003 +0000 UTC m=+23.748002855" watchObservedRunningTime="2026-04-16 18:10:04.094788825 +0000 UTC m=+23.750023678" Apr 16 18:10:04.933668 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:04.933458 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fswkr" Apr 16 18:10:04.933860 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:04.933472 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sx24k" Apr 16 18:10:04.933860 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:04.933763 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fswkr" podUID="4b2b1c61-3440-4f11-9320-0d47781218e5" Apr 16 18:10:04.933990 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:04.933908 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sx24k" podUID="15fcda3c-2ebe-475b-bd0f-7c9f1ed74875" Apr 16 18:10:05.934198 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:05.933990 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgtvr" Apr 16 18:10:05.934903 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:05.934275 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kgtvr" podUID="182ef3ca-8527-40a2-b1a7-c714bd3509c5"
Apr 16 18:10:06.084382 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:06.084346 2574 generic.go:358] "Generic (PLEG): container finished" podID="a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9" containerID="637c70fcc62a2f47cfbb96ab3a741c414fda3b8eff90cdb0fe52abd3ce9ec5a4" exitCode=0
Apr 16 18:10:06.084539 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:06.084423 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lwp8t" event={"ID":"a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9","Type":"ContainerDied","Data":"637c70fcc62a2f47cfbb96ab3a741c414fda3b8eff90cdb0fe52abd3ce9ec5a4"}
Apr 16 18:10:06.087543 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:06.087526 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wt28v_d40f8597-4c6c-46ba-9f26-4cea171429a6/ovn-acl-logging/0.log"
Apr 16 18:10:06.087905 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:06.087882 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wt28v" event={"ID":"d40f8597-4c6c-46ba-9f26-4cea171429a6","Type":"ContainerStarted","Data":"746c9b0783af2416188991026d587a09026a952ad4bd612016ef6219f5753385"}
Apr 16 18:10:06.088249 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:06.088226 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:10:06.088249 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:06.088247 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:10:06.088456 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:06.088440 2574 scope.go:117] "RemoveContainer" containerID="9b2167e4958e72ceb7ea95a6ac9809b4cb5a33562f4c1f51358df1a13833b37d"
Apr 16 18:10:06.104421 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:06.104402 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:10:06.104818 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:06.104788 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:10:06.933226 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:06.933186 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fswkr"
Apr 16 18:10:06.933412 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:06.933202 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sx24k"
Apr 16 18:10:06.933412 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:06.933314 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fswkr" podUID="4b2b1c61-3440-4f11-9320-0d47781218e5"
Apr 16 18:10:06.933412 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:06.933365 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sx24k" podUID="15fcda3c-2ebe-475b-bd0f-7c9f1ed74875"
Apr 16 18:10:07.092392 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:07.092308 2574 generic.go:358] "Generic (PLEG): container finished" podID="a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9" containerID="31a541c13fa9ddc7519b15e9b068f1670d3ddcacf8bd50fc8bbdedd1b4ce2df1" exitCode=0
Apr 16 18:10:07.092812 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:07.092393 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lwp8t" event={"ID":"a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9","Type":"ContainerDied","Data":"31a541c13fa9ddc7519b15e9b068f1670d3ddcacf8bd50fc8bbdedd1b4ce2df1"}
Apr 16 18:10:07.096457 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:07.096397 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wt28v_d40f8597-4c6c-46ba-9f26-4cea171429a6/ovn-acl-logging/0.log"
Apr 16 18:10:07.096804 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:07.096774 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wt28v" event={"ID":"d40f8597-4c6c-46ba-9f26-4cea171429a6","Type":"ContainerStarted","Data":"0ba0b4b2a93861a33b4bf3428347f639831faff8ef8c39b31900c8635c48aff9"}
Apr 16 18:10:07.096967 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:07.096951 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 18:10:07.149761 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:07.149715 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wt28v" podStartSLOduration=9.027998491 podStartE2EDuration="26.149697508s" podCreationTimestamp="2026-04-16 18:09:41 +0000 UTC" firstStartedPulling="2026-04-16 18:09:43.522817536 +0000 UTC m=+3.178052368" lastFinishedPulling="2026-04-16 18:10:00.644516555 +0000 UTC m=+20.299751385" observedRunningTime="2026-04-16 18:10:07.149693173 +0000 UTC m=+26.804928060" watchObservedRunningTime="2026-04-16 18:10:07.149697508 +0000 UTC m=+26.804932358"
Apr 16 18:10:07.634986 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:07.634920 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-sx24k"]
Apr 16 18:10:07.635199 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:07.635104 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sx24k"
Apr 16 18:10:07.635266 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:07.635242 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sx24k" podUID="15fcda3c-2ebe-475b-bd0f-7c9f1ed74875"
Apr 16 18:10:07.637513 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:07.637477 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kgtvr"]
Apr 16 18:10:07.637658 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:07.637628 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgtvr"
Apr 16 18:10:07.637764 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:07.637742 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kgtvr" podUID="182ef3ca-8527-40a2-b1a7-c714bd3509c5"
Apr 16 18:10:07.638115 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:07.638091 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-fswkr"]
Apr 16 18:10:07.638214 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:07.638189 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fswkr"
Apr 16 18:10:07.638292 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:07.638270 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fswkr" podUID="4b2b1c61-3440-4f11-9320-0d47781218e5"
Apr 16 18:10:08.100923 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:08.100816 2574 generic.go:358] "Generic (PLEG): container finished" podID="a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9" containerID="b7bf830718fa906e0708e89494ce763cc2af415414cee896c0baaed56082db1d" exitCode=0
Apr 16 18:10:08.101272 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:08.100953 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lwp8t" event={"ID":"a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9","Type":"ContainerDied","Data":"b7bf830718fa906e0708e89494ce763cc2af415414cee896c0baaed56082db1d"}
Apr 16 18:10:08.101272 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:08.101145 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 18:10:08.933365 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:08.933289 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sx24k"
Apr 16 18:10:08.933365 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:08.933310 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgtvr"
Apr 16 18:10:08.933604 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:08.933428 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sx24k" podUID="15fcda3c-2ebe-475b-bd0f-7c9f1ed74875"
Apr 16 18:10:08.933813 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:08.933784 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kgtvr" podUID="182ef3ca-8527-40a2-b1a7-c714bd3509c5"
Apr 16 18:10:09.933219 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:09.933182 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fswkr"
Apr 16 18:10:09.933671 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:09.933318 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fswkr" podUID="4b2b1c61-3440-4f11-9320-0d47781218e5"
Apr 16 18:10:10.934238 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:10.934035 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sx24k"
Apr 16 18:10:10.934702 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:10.934101 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgtvr"
Apr 16 18:10:10.934702 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:10.934331 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sx24k" podUID="15fcda3c-2ebe-475b-bd0f-7c9f1ed74875"
Apr 16 18:10:10.934702 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:10.934431 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kgtvr" podUID="182ef3ca-8527-40a2-b1a7-c714bd3509c5"
Apr 16 18:10:11.933762 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:11.933718 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fswkr"
Apr 16 18:10:11.933965 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:11.933848 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fswkr" podUID="4b2b1c61-3440-4f11-9320-0d47781218e5"
Apr 16 18:10:11.943293 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:11.943257 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wt28v"
Apr 16 18:10:11.943761 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:11.943514 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 18:10:11.955765 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:11.955699 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-wt28v" podUID="d40f8597-4c6c-46ba-9f26-4cea171429a6" containerName="ovnkube-controller" probeResult="failure" output=""
Apr 16 18:10:11.966528 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:11.966489 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-wt28v" podUID="d40f8597-4c6c-46ba-9f26-4cea171429a6" containerName="ovnkube-controller" probeResult="failure" output=""
Apr 16 18:10:12.933879 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:12.933667 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sx24k"
Apr 16 18:10:12.933879 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:12.933667 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgtvr"
Apr 16 18:10:12.933879 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:12.933816 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sx24k" podUID="15fcda3c-2ebe-475b-bd0f-7c9f1ed74875"
Apr 16 18:10:12.934316 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:12.933910 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kgtvr" podUID="182ef3ca-8527-40a2-b1a7-c714bd3509c5"
Apr 16 18:10:13.131701 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:13.131663 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-v5jrt"
Apr 16 18:10:13.132286 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:13.131853 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 18:10:13.132748 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:13.132727 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-v5jrt"
Apr 16 18:10:13.933768 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:13.933735 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fswkr"
Apr 16 18:10:13.933952 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:13.933865 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fswkr" podUID="4b2b1c61-3440-4f11-9320-0d47781218e5"
Apr 16 18:10:14.116639 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.116553 2574 generic.go:358] "Generic (PLEG): container finished" podID="a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9" containerID="51cee45b8b9d4f67a056cbdd17c1217117c5a8a4d83e8c2be87239658ec1fdef" exitCode=0
Apr 16 18:10:14.116780 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.116641 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lwp8t" event={"ID":"a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9","Type":"ContainerDied","Data":"51cee45b8b9d4f67a056cbdd17c1217117c5a8a4d83e8c2be87239658ec1fdef"}
Apr 16 18:10:14.171175 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.171146 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-43.ec2.internal" event="NodeReady"
Apr 16 18:10:14.171899 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.171311 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 18:10:14.236558 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.236524 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-mj8zq"]
Apr 16 18:10:14.271085 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.271054 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2zvsx"]
Apr 16 18:10:14.271258 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.271218 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mj8zq"
Apr 16 18:10:14.273637 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.273611 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 18:10:14.273761 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.273671 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-bnqh9\""
Apr 16 18:10:14.273761 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.273617 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 18:10:14.292051 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.292020 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2zvsx"]
Apr 16 18:10:14.292051 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.292058 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mj8zq"]
Apr 16 18:10:14.292261 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.292170 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2zvsx"
Apr 16 18:10:14.294871 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.294745 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 18:10:14.294871 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.294779 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 18:10:14.294871 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.294795 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 18:10:14.294871 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.294840 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ffhs9\""
Apr 16 18:10:14.362350 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.362317 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4c20834-fffd-49b6-be94-da4be1bc80a8-cert\") pod \"ingress-canary-2zvsx\" (UID: \"e4c20834-fffd-49b6-be94-da4be1bc80a8\") " pod="openshift-ingress-canary/ingress-canary-2zvsx"
Apr 16 18:10:14.362505 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.362355 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/befca98d-bc99-4ce7-82eb-9f84457dc655-tmp-dir\") pod \"dns-default-mj8zq\" (UID: \"befca98d-bc99-4ce7-82eb-9f84457dc655\") " pod="openshift-dns/dns-default-mj8zq"
Apr 16 18:10:14.362505 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.362391 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcm7d\" (UniqueName: \"kubernetes.io/projected/e4c20834-fffd-49b6-be94-da4be1bc80a8-kube-api-access-bcm7d\") pod \"ingress-canary-2zvsx\" (UID: \"e4c20834-fffd-49b6-be94-da4be1bc80a8\") " pod="openshift-ingress-canary/ingress-canary-2zvsx"
Apr 16 18:10:14.362505 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.362415 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/befca98d-bc99-4ce7-82eb-9f84457dc655-metrics-tls\") pod \"dns-default-mj8zq\" (UID: \"befca98d-bc99-4ce7-82eb-9f84457dc655\") " pod="openshift-dns/dns-default-mj8zq"
Apr 16 18:10:14.362505 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.362443 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/befca98d-bc99-4ce7-82eb-9f84457dc655-config-volume\") pod \"dns-default-mj8zq\" (UID: \"befca98d-bc99-4ce7-82eb-9f84457dc655\") " pod="openshift-dns/dns-default-mj8zq"
Apr 16 18:10:14.362665 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.362508 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st6gs\" (UniqueName: \"kubernetes.io/projected/befca98d-bc99-4ce7-82eb-9f84457dc655-kube-api-access-st6gs\") pod \"dns-default-mj8zq\" (UID: \"befca98d-bc99-4ce7-82eb-9f84457dc655\") " pod="openshift-dns/dns-default-mj8zq"
Apr 16 18:10:14.463648 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.463609 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4c20834-fffd-49b6-be94-da4be1bc80a8-cert\") pod \"ingress-canary-2zvsx\" (UID: \"e4c20834-fffd-49b6-be94-da4be1bc80a8\") " pod="openshift-ingress-canary/ingress-canary-2zvsx"
Apr 16 18:10:14.463798 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.463656 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/befca98d-bc99-4ce7-82eb-9f84457dc655-tmp-dir\") pod \"dns-default-mj8zq\" (UID: \"befca98d-bc99-4ce7-82eb-9f84457dc655\") " pod="openshift-dns/dns-default-mj8zq"
Apr 16 18:10:14.463798 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.463676 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bcm7d\" (UniqueName: \"kubernetes.io/projected/e4c20834-fffd-49b6-be94-da4be1bc80a8-kube-api-access-bcm7d\") pod \"ingress-canary-2zvsx\" (UID: \"e4c20834-fffd-49b6-be94-da4be1bc80a8\") " pod="openshift-ingress-canary/ingress-canary-2zvsx"
Apr 16 18:10:14.463798 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.463697 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/befca98d-bc99-4ce7-82eb-9f84457dc655-metrics-tls\") pod \"dns-default-mj8zq\" (UID: \"befca98d-bc99-4ce7-82eb-9f84457dc655\") " pod="openshift-dns/dns-default-mj8zq"
Apr 16 18:10:14.463798 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:14.463769 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:10:14.463798 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:14.463792 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:10:14.464071 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:14.463860 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4c20834-fffd-49b6-be94-da4be1bc80a8-cert podName:e4c20834-fffd-49b6-be94-da4be1bc80a8 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:14.963820821 +0000 UTC m=+34.619055669 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4c20834-fffd-49b6-be94-da4be1bc80a8-cert") pod "ingress-canary-2zvsx" (UID: "e4c20834-fffd-49b6-be94-da4be1bc80a8") : secret "canary-serving-cert" not found
Apr 16 18:10:14.464071 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.463882 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/befca98d-bc99-4ce7-82eb-9f84457dc655-config-volume\") pod \"dns-default-mj8zq\" (UID: \"befca98d-bc99-4ce7-82eb-9f84457dc655\") " pod="openshift-dns/dns-default-mj8zq"
Apr 16 18:10:14.464071 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.463925 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-st6gs\" (UniqueName: \"kubernetes.io/projected/befca98d-bc99-4ce7-82eb-9f84457dc655-kube-api-access-st6gs\") pod \"dns-default-mj8zq\" (UID: \"befca98d-bc99-4ce7-82eb-9f84457dc655\") " pod="openshift-dns/dns-default-mj8zq"
Apr 16 18:10:14.464071 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:14.463947 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/befca98d-bc99-4ce7-82eb-9f84457dc655-metrics-tls podName:befca98d-bc99-4ce7-82eb-9f84457dc655 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:14.963929376 +0000 UTC m=+34.619164217 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/befca98d-bc99-4ce7-82eb-9f84457dc655-metrics-tls") pod "dns-default-mj8zq" (UID: "befca98d-bc99-4ce7-82eb-9f84457dc655") : secret "dns-default-metrics-tls" not found
Apr 16 18:10:14.464071 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.464009 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/befca98d-bc99-4ce7-82eb-9f84457dc655-tmp-dir\") pod \"dns-default-mj8zq\" (UID: \"befca98d-bc99-4ce7-82eb-9f84457dc655\") " pod="openshift-dns/dns-default-mj8zq"
Apr 16 18:10:14.464350 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.464330 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/befca98d-bc99-4ce7-82eb-9f84457dc655-config-volume\") pod \"dns-default-mj8zq\" (UID: \"befca98d-bc99-4ce7-82eb-9f84457dc655\") " pod="openshift-dns/dns-default-mj8zq"
Apr 16 18:10:14.477048 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.477020 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-st6gs\" (UniqueName: \"kubernetes.io/projected/befca98d-bc99-4ce7-82eb-9f84457dc655-kube-api-access-st6gs\") pod \"dns-default-mj8zq\" (UID: \"befca98d-bc99-4ce7-82eb-9f84457dc655\") " pod="openshift-dns/dns-default-mj8zq"
Apr 16 18:10:14.477101 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.477071 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcm7d\" (UniqueName: \"kubernetes.io/projected/e4c20834-fffd-49b6-be94-da4be1bc80a8-kube-api-access-bcm7d\") pod \"ingress-canary-2zvsx\" (UID: \"e4c20834-fffd-49b6-be94-da4be1bc80a8\") " pod="openshift-ingress-canary/ingress-canary-2zvsx"
Apr 16 18:10:14.565374 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.565331 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/15fcda3c-2ebe-475b-bd0f-7c9f1ed74875-original-pull-secret\") pod \"global-pull-secret-syncer-sx24k\" (UID: \"15fcda3c-2ebe-475b-bd0f-7c9f1ed74875\") " pod="kube-system/global-pull-secret-syncer-sx24k"
Apr 16 18:10:14.565540 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:14.565453 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:10:14.565540 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:14.565514 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15fcda3c-2ebe-475b-bd0f-7c9f1ed74875-original-pull-secret podName:15fcda3c-2ebe-475b-bd0f-7c9f1ed74875 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:46.565501552 +0000 UTC m=+66.220736382 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/15fcda3c-2ebe-475b-bd0f-7c9f1ed74875-original-pull-secret") pod "global-pull-secret-syncer-sx24k" (UID: "15fcda3c-2ebe-475b-bd0f-7c9f1ed74875") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:10:14.666221 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.666183 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/182ef3ca-8527-40a2-b1a7-c714bd3509c5-metrics-certs\") pod \"network-metrics-daemon-kgtvr\" (UID: \"182ef3ca-8527-40a2-b1a7-c714bd3509c5\") " pod="openshift-multus/network-metrics-daemon-kgtvr"
Apr 16 18:10:14.666340 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.666285 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-946x6\" (UniqueName: \"kubernetes.io/projected/4b2b1c61-3440-4f11-9320-0d47781218e5-kube-api-access-946x6\") pod \"network-check-target-fswkr\" (UID: \"4b2b1c61-3440-4f11-9320-0d47781218e5\") " pod="openshift-network-diagnostics/network-check-target-fswkr"
Apr 16 18:10:14.666379 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:14.666341 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:10:14.666412 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:14.666387 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:10:14.666412 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:14.666403 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/182ef3ca-8527-40a2-b1a7-c714bd3509c5-metrics-certs podName:182ef3ca-8527-40a2-b1a7-c714bd3509c5 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:46.666388472 +0000 UTC m=+66.321623303 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/182ef3ca-8527-40a2-b1a7-c714bd3509c5-metrics-certs") pod "network-metrics-daemon-kgtvr" (UID: "182ef3ca-8527-40a2-b1a7-c714bd3509c5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:10:14.666412 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:14.666405 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:10:14.666520 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:14.666418 2574 projected.go:194] Error preparing data for projected volume kube-api-access-946x6 for pod openshift-network-diagnostics/network-check-target-fswkr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:10:14.666520 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:14.666455 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b2b1c61-3440-4f11-9320-0d47781218e5-kube-api-access-946x6 podName:4b2b1c61-3440-4f11-9320-0d47781218e5 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:46.666443605 +0000 UTC m=+66.321678436 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-946x6" (UniqueName: "kubernetes.io/projected/4b2b1c61-3440-4f11-9320-0d47781218e5-kube-api-access-946x6") pod "network-check-target-fswkr" (UID: "4b2b1c61-3440-4f11-9320-0d47781218e5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:10:14.933545 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.933509 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgtvr"
Apr 16 18:10:14.933729 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.933509 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sx24k"
Apr 16 18:10:14.937066 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.937043 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 18:10:14.937220 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.937141 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kjmpp\""
Apr 16 18:10:14.937449 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.937433 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 18:10:14.968913 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.968870 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4c20834-fffd-49b6-be94-da4be1bc80a8-cert\") pod \"ingress-canary-2zvsx\" (UID: \"e4c20834-fffd-49b6-be94-da4be1bc80a8\") " pod="openshift-ingress-canary/ingress-canary-2zvsx"
Apr 16 18:10:14.968913 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:14.968917 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/befca98d-bc99-4ce7-82eb-9f84457dc655-metrics-tls\") pod \"dns-default-mj8zq\" (UID: \"befca98d-bc99-4ce7-82eb-9f84457dc655\") " pod="openshift-dns/dns-default-mj8zq"
Apr 16 18:10:14.969140 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:14.969015 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:10:14.969140 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:14.969068 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/befca98d-bc99-4ce7-82eb-9f84457dc655-metrics-tls podName:befca98d-bc99-4ce7-82eb-9f84457dc655 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:15.969054063 +0000 UTC m=+35.624288893 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/befca98d-bc99-4ce7-82eb-9f84457dc655-metrics-tls") pod "dns-default-mj8zq" (UID: "befca98d-bc99-4ce7-82eb-9f84457dc655") : secret "dns-default-metrics-tls" not found
Apr 16 18:10:14.969140 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:14.969015 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:10:14.969244 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:14.969159 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4c20834-fffd-49b6-be94-da4be1bc80a8-cert podName:e4c20834-fffd-49b6-be94-da4be1bc80a8 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:15.969144834 +0000 UTC m=+35.624379665 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4c20834-fffd-49b6-be94-da4be1bc80a8-cert") pod "ingress-canary-2zvsx" (UID: "e4c20834-fffd-49b6-be94-da4be1bc80a8") : secret "canary-serving-cert" not found
Apr 16 18:10:15.120794 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:15.120758 2574 generic.go:358] "Generic (PLEG): container finished" podID="a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9" containerID="bc49497e15344bab675781dfd5efa37c6ac221ac932e3fdc53da2b0c668ad079" exitCode=0
Apr 16 18:10:15.120975 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:15.120819 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lwp8t" event={"ID":"a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9","Type":"ContainerDied","Data":"bc49497e15344bab675781dfd5efa37c6ac221ac932e3fdc53da2b0c668ad079"}
Apr 16 18:10:15.933063 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:15.933025 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fswkr"
Apr 16 18:10:15.936006 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:15.935978 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 18:10:15.936006 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:15.936001 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 18:10:15.937015 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:15.937000 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-bnhf5\""
Apr 16 18:10:15.977842 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:15.977780 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4c20834-fffd-49b6-be94-da4be1bc80a8-cert\") pod \"ingress-canary-2zvsx\" (UID: \"e4c20834-fffd-49b6-be94-da4be1bc80a8\") " pod="openshift-ingress-canary/ingress-canary-2zvsx"
Apr 16 18:10:15.978021 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:15.977850 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/befca98d-bc99-4ce7-82eb-9f84457dc655-metrics-tls\") pod \"dns-default-mj8zq\" (UID: \"befca98d-bc99-4ce7-82eb-9f84457dc655\") " pod="openshift-dns/dns-default-mj8zq"
Apr 16 18:10:15.978021 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:15.977940 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:10:15.978021 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:15.978001 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:10:15.978147 ip-10-0-142-43 kubenswrapper[2574]: E0416
18:10:15.978005 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4c20834-fffd-49b6-be94-da4be1bc80a8-cert podName:e4c20834-fffd-49b6-be94-da4be1bc80a8 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:17.977988359 +0000 UTC m=+37.633223189 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4c20834-fffd-49b6-be94-da4be1bc80a8-cert") pod "ingress-canary-2zvsx" (UID: "e4c20834-fffd-49b6-be94-da4be1bc80a8") : secret "canary-serving-cert" not found Apr 16 18:10:15.978147 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:15.978061 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/befca98d-bc99-4ce7-82eb-9f84457dc655-metrics-tls podName:befca98d-bc99-4ce7-82eb-9f84457dc655 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:17.978044728 +0000 UTC m=+37.633279559 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/befca98d-bc99-4ce7-82eb-9f84457dc655-metrics-tls") pod "dns-default-mj8zq" (UID: "befca98d-bc99-4ce7-82eb-9f84457dc655") : secret "dns-default-metrics-tls" not found Apr 16 18:10:16.125856 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:16.125804 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lwp8t" event={"ID":"a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9","Type":"ContainerStarted","Data":"1ed1e7f1924a7d6725b9c8dfbf7e24facb49025de52cea768b9a5bd0e28998ae"} Apr 16 18:10:16.153409 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:16.153353 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-lwp8t" podStartSLOduration=5.113581914 podStartE2EDuration="35.153299324s" podCreationTimestamp="2026-04-16 18:09:41 +0000 UTC" firstStartedPulling="2026-04-16 18:09:43.527782812 +0000 UTC m=+3.183017643" lastFinishedPulling="2026-04-16 
18:10:13.567500207 +0000 UTC m=+33.222735053" observedRunningTime="2026-04-16 18:10:16.151961831 +0000 UTC m=+35.807196710" watchObservedRunningTime="2026-04-16 18:10:16.153299324 +0000 UTC m=+35.808534178" Apr 16 18:10:17.995580 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:17.995365 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4c20834-fffd-49b6-be94-da4be1bc80a8-cert\") pod \"ingress-canary-2zvsx\" (UID: \"e4c20834-fffd-49b6-be94-da4be1bc80a8\") " pod="openshift-ingress-canary/ingress-canary-2zvsx" Apr 16 18:10:17.995970 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:17.995596 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/befca98d-bc99-4ce7-82eb-9f84457dc655-metrics-tls\") pod \"dns-default-mj8zq\" (UID: \"befca98d-bc99-4ce7-82eb-9f84457dc655\") " pod="openshift-dns/dns-default-mj8zq" Apr 16 18:10:17.995970 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:17.995529 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:10:17.995970 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:17.995691 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:10:17.995970 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:17.995717 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4c20834-fffd-49b6-be94-da4be1bc80a8-cert podName:e4c20834-fffd-49b6-be94-da4be1bc80a8 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:21.995700583 +0000 UTC m=+41.650935413 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4c20834-fffd-49b6-be94-da4be1bc80a8-cert") pod "ingress-canary-2zvsx" (UID: "e4c20834-fffd-49b6-be94-da4be1bc80a8") : secret "canary-serving-cert" not found Apr 16 18:10:17.995970 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:17.995744 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/befca98d-bc99-4ce7-82eb-9f84457dc655-metrics-tls podName:befca98d-bc99-4ce7-82eb-9f84457dc655 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:21.995733322 +0000 UTC m=+41.650968153 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/befca98d-bc99-4ce7-82eb-9f84457dc655-metrics-tls") pod "dns-default-mj8zq" (UID: "befca98d-bc99-4ce7-82eb-9f84457dc655") : secret "dns-default-metrics-tls" not found Apr 16 18:10:22.027256 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:22.027210 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4c20834-fffd-49b6-be94-da4be1bc80a8-cert\") pod \"ingress-canary-2zvsx\" (UID: \"e4c20834-fffd-49b6-be94-da4be1bc80a8\") " pod="openshift-ingress-canary/ingress-canary-2zvsx" Apr 16 18:10:22.027256 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:22.027257 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/befca98d-bc99-4ce7-82eb-9f84457dc655-metrics-tls\") pod \"dns-default-mj8zq\" (UID: \"befca98d-bc99-4ce7-82eb-9f84457dc655\") " pod="openshift-dns/dns-default-mj8zq" Apr 16 18:10:22.027783 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:22.027349 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:10:22.027783 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:22.027377 2574 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:10:22.027783 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:22.027404 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4c20834-fffd-49b6-be94-da4be1bc80a8-cert podName:e4c20834-fffd-49b6-be94-da4be1bc80a8 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:30.027389771 +0000 UTC m=+49.682624603 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4c20834-fffd-49b6-be94-da4be1bc80a8-cert") pod "ingress-canary-2zvsx" (UID: "e4c20834-fffd-49b6-be94-da4be1bc80a8") : secret "canary-serving-cert" not found Apr 16 18:10:22.027783 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:22.027420 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/befca98d-bc99-4ce7-82eb-9f84457dc655-metrics-tls podName:befca98d-bc99-4ce7-82eb-9f84457dc655 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:30.027409963 +0000 UTC m=+49.682644794 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/befca98d-bc99-4ce7-82eb-9f84457dc655-metrics-tls") pod "dns-default-mj8zq" (UID: "befca98d-bc99-4ce7-82eb-9f84457dc655") : secret "dns-default-metrics-tls" not found Apr 16 18:10:30.084742 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:30.084701 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4c20834-fffd-49b6-be94-da4be1bc80a8-cert\") pod \"ingress-canary-2zvsx\" (UID: \"e4c20834-fffd-49b6-be94-da4be1bc80a8\") " pod="openshift-ingress-canary/ingress-canary-2zvsx" Apr 16 18:10:30.084742 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:30.084746 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/befca98d-bc99-4ce7-82eb-9f84457dc655-metrics-tls\") pod \"dns-default-mj8zq\" (UID: \"befca98d-bc99-4ce7-82eb-9f84457dc655\") " pod="openshift-dns/dns-default-mj8zq" Apr 16 18:10:30.085265 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:30.084865 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:10:30.085265 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:30.084868 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:10:30.085265 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:30.084921 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4c20834-fffd-49b6-be94-da4be1bc80a8-cert podName:e4c20834-fffd-49b6-be94-da4be1bc80a8 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:46.084905447 +0000 UTC m=+65.740140278 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4c20834-fffd-49b6-be94-da4be1bc80a8-cert") pod "ingress-canary-2zvsx" (UID: "e4c20834-fffd-49b6-be94-da4be1bc80a8") : secret "canary-serving-cert" not found Apr 16 18:10:30.085265 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:30.084947 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/befca98d-bc99-4ce7-82eb-9f84457dc655-metrics-tls podName:befca98d-bc99-4ce7-82eb-9f84457dc655 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:46.08493565 +0000 UTC m=+65.740170481 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/befca98d-bc99-4ce7-82eb-9f84457dc655-metrics-tls") pod "dns-default-mj8zq" (UID: "befca98d-bc99-4ce7-82eb-9f84457dc655") : secret "dns-default-metrics-tls" not found Apr 16 18:10:41.965798 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:41.965770 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wt28v" Apr 16 18:10:46.103085 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:46.103032 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/befca98d-bc99-4ce7-82eb-9f84457dc655-metrics-tls\") pod \"dns-default-mj8zq\" (UID: \"befca98d-bc99-4ce7-82eb-9f84457dc655\") " pod="openshift-dns/dns-default-mj8zq" Apr 16 18:10:46.103452 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:46.103164 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4c20834-fffd-49b6-be94-da4be1bc80a8-cert\") pod \"ingress-canary-2zvsx\" (UID: \"e4c20834-fffd-49b6-be94-da4be1bc80a8\") " pod="openshift-ingress-canary/ingress-canary-2zvsx" Apr 16 18:10:46.103452 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:46.103209 2574 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:10:46.103452 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:46.103252 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:10:46.103452 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:46.103302 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/befca98d-bc99-4ce7-82eb-9f84457dc655-metrics-tls podName:befca98d-bc99-4ce7-82eb-9f84457dc655 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:18.103278988 +0000 UTC m=+97.758513858 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/befca98d-bc99-4ce7-82eb-9f84457dc655-metrics-tls") pod "dns-default-mj8zq" (UID: "befca98d-bc99-4ce7-82eb-9f84457dc655") : secret "dns-default-metrics-tls" not found Apr 16 18:10:46.103452 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:46.103323 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4c20834-fffd-49b6-be94-da4be1bc80a8-cert podName:e4c20834-fffd-49b6-be94-da4be1bc80a8 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:18.103314275 +0000 UTC m=+97.758549110 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4c20834-fffd-49b6-be94-da4be1bc80a8-cert") pod "ingress-canary-2zvsx" (UID: "e4c20834-fffd-49b6-be94-da4be1bc80a8") : secret "canary-serving-cert" not found Apr 16 18:10:46.606361 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:46.606298 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/15fcda3c-2ebe-475b-bd0f-7c9f1ed74875-original-pull-secret\") pod \"global-pull-secret-syncer-sx24k\" (UID: \"15fcda3c-2ebe-475b-bd0f-7c9f1ed74875\") " pod="kube-system/global-pull-secret-syncer-sx24k" Apr 16 18:10:46.609102 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:46.609081 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 18:10:46.620291 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:46.620264 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/15fcda3c-2ebe-475b-bd0f-7c9f1ed74875-original-pull-secret\") pod \"global-pull-secret-syncer-sx24k\" (UID: \"15fcda3c-2ebe-475b-bd0f-7c9f1ed74875\") " pod="kube-system/global-pull-secret-syncer-sx24k" Apr 16 18:10:46.706967 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:46.706933 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-946x6\" (UniqueName: \"kubernetes.io/projected/4b2b1c61-3440-4f11-9320-0d47781218e5-kube-api-access-946x6\") pod \"network-check-target-fswkr\" (UID: \"4b2b1c61-3440-4f11-9320-0d47781218e5\") " pod="openshift-network-diagnostics/network-check-target-fswkr" Apr 16 18:10:46.706967 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:46.706980 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/182ef3ca-8527-40a2-b1a7-c714bd3509c5-metrics-certs\") pod 
\"network-metrics-daemon-kgtvr\" (UID: \"182ef3ca-8527-40a2-b1a7-c714bd3509c5\") " pod="openshift-multus/network-metrics-daemon-kgtvr" Apr 16 18:10:46.709708 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:46.709689 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:10:46.709782 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:46.709695 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 18:10:46.718104 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:46.718072 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 18:10:46.718265 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:10:46.718171 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/182ef3ca-8527-40a2-b1a7-c714bd3509c5-metrics-certs podName:182ef3ca-8527-40a2-b1a7-c714bd3509c5 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:50.718141621 +0000 UTC m=+130.373376456 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/182ef3ca-8527-40a2-b1a7-c714bd3509c5-metrics-certs") pod "network-metrics-daemon-kgtvr" (UID: "182ef3ca-8527-40a2-b1a7-c714bd3509c5") : secret "metrics-daemon-secret" not found Apr 16 18:10:46.720290 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:46.720270 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 18:10:46.730916 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:46.730876 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-946x6\" (UniqueName: \"kubernetes.io/projected/4b2b1c61-3440-4f11-9320-0d47781218e5-kube-api-access-946x6\") pod \"network-check-target-fswkr\" (UID: \"4b2b1c61-3440-4f11-9320-0d47781218e5\") " pod="openshift-network-diagnostics/network-check-target-fswkr" Apr 16 18:10:46.747940 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:46.747905 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sx24k" Apr 16 18:10:46.844867 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:46.844794 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-bnhf5\"" Apr 16 18:10:46.853201 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:46.853167 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fswkr" Apr 16 18:10:46.885528 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:46.885474 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-sx24k"] Apr 16 18:10:46.890517 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:10:46.889733 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15fcda3c_2ebe_475b_bd0f_7c9f1ed74875.slice/crio-a0fd43854f718ed78dba50f2cfa798fc8544f09b21f9ad6b1e1fe4df178b6901 WatchSource:0}: Error finding container a0fd43854f718ed78dba50f2cfa798fc8544f09b21f9ad6b1e1fe4df178b6901: Status 404 returned error can't find the container with id a0fd43854f718ed78dba50f2cfa798fc8544f09b21f9ad6b1e1fe4df178b6901 Apr 16 18:10:46.995364 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:46.995329 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-fswkr"] Apr 16 18:10:46.998534 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:10:46.998503 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b2b1c61_3440_4f11_9320_0d47781218e5.slice/crio-6b636d6edaaa4ff018ecb5d9efe7390ec5fa2f5097ed8e2ccf6252583a4f039f WatchSource:0}: Error finding container 6b636d6edaaa4ff018ecb5d9efe7390ec5fa2f5097ed8e2ccf6252583a4f039f: Status 404 returned error can't find the container with id 6b636d6edaaa4ff018ecb5d9efe7390ec5fa2f5097ed8e2ccf6252583a4f039f Apr 16 18:10:47.188793 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:47.188759 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fswkr" event={"ID":"4b2b1c61-3440-4f11-9320-0d47781218e5","Type":"ContainerStarted","Data":"6b636d6edaaa4ff018ecb5d9efe7390ec5fa2f5097ed8e2ccf6252583a4f039f"} Apr 16 18:10:47.189806 ip-10-0-142-43 kubenswrapper[2574]: 
I0416 18:10:47.189779 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-sx24k" event={"ID":"15fcda3c-2ebe-475b-bd0f-7c9f1ed74875","Type":"ContainerStarted","Data":"a0fd43854f718ed78dba50f2cfa798fc8544f09b21f9ad6b1e1fe4df178b6901"} Apr 16 18:10:51.939531 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:51.939493 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-85479b4b77-lsgx4"] Apr 16 18:10:51.942389 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:51.942369 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-85479b4b77-lsgx4" Apr 16 18:10:51.946173 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:51.946140 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 18:10:51.947549 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:51.947463 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 18:10:51.947549 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:51.947493 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-85p52\"" Apr 16 18:10:51.947549 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:51.947504 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 18:10:51.947549 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:51.947505 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 18:10:51.953395 ip-10-0-142-43 
kubenswrapper[2574]: I0416 18:10:51.953367 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-85479b4b77-lsgx4"] Apr 16 18:10:51.974852 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:51.974807 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-54f49bd968-qqlvt"] Apr 16 18:10:51.977501 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:51.977485 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-54f49bd968-qqlvt" Apr 16 18:10:51.980273 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:51.980254 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 16 18:10:51.985940 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:51.985915 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-54f49bd968-qqlvt"] Apr 16 18:10:52.047607 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:52.047563 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvx7b\" (UniqueName: \"kubernetes.io/projected/468e41ab-46bf-4020-b40c-318aafbb9eee-kube-api-access-xvx7b\") pod \"managed-serviceaccount-addon-agent-85479b4b77-lsgx4\" (UID: \"468e41ab-46bf-4020-b40c-318aafbb9eee\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-85479b4b77-lsgx4" Apr 16 18:10:52.047607 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:52.047605 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fa9abd85-b468-44db-95b1-d394c1fb105a-tmp\") pod \"klusterlet-addon-workmgr-54f49bd968-qqlvt\" (UID: \"fa9abd85-b468-44db-95b1-d394c1fb105a\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-54f49bd968-qqlvt" Apr 16 18:10:52.047895 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:52.047674 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/468e41ab-46bf-4020-b40c-318aafbb9eee-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-85479b4b77-lsgx4\" (UID: \"468e41ab-46bf-4020-b40c-318aafbb9eee\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-85479b4b77-lsgx4" Apr 16 18:10:52.047895 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:52.047736 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6gdd\" (UniqueName: \"kubernetes.io/projected/fa9abd85-b468-44db-95b1-d394c1fb105a-kube-api-access-d6gdd\") pod \"klusterlet-addon-workmgr-54f49bd968-qqlvt\" (UID: \"fa9abd85-b468-44db-95b1-d394c1fb105a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-54f49bd968-qqlvt" Apr 16 18:10:52.047895 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:52.047761 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/fa9abd85-b468-44db-95b1-d394c1fb105a-klusterlet-config\") pod \"klusterlet-addon-workmgr-54f49bd968-qqlvt\" (UID: \"fa9abd85-b468-44db-95b1-d394c1fb105a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-54f49bd968-qqlvt" Apr 16 18:10:52.149039 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:52.148990 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/468e41ab-46bf-4020-b40c-318aafbb9eee-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-85479b4b77-lsgx4\" (UID: \"468e41ab-46bf-4020-b40c-318aafbb9eee\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-85479b4b77-lsgx4"
Apr 16 18:10:52.149199 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:52.149080 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d6gdd\" (UniqueName: \"kubernetes.io/projected/fa9abd85-b468-44db-95b1-d394c1fb105a-kube-api-access-d6gdd\") pod \"klusterlet-addon-workmgr-54f49bd968-qqlvt\" (UID: \"fa9abd85-b468-44db-95b1-d394c1fb105a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-54f49bd968-qqlvt"
Apr 16 18:10:52.149199 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:52.149110 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/fa9abd85-b468-44db-95b1-d394c1fb105a-klusterlet-config\") pod \"klusterlet-addon-workmgr-54f49bd968-qqlvt\" (UID: \"fa9abd85-b468-44db-95b1-d394c1fb105a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-54f49bd968-qqlvt"
Apr 16 18:10:52.149199 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:52.149153 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xvx7b\" (UniqueName: \"kubernetes.io/projected/468e41ab-46bf-4020-b40c-318aafbb9eee-kube-api-access-xvx7b\") pod \"managed-serviceaccount-addon-agent-85479b4b77-lsgx4\" (UID: \"468e41ab-46bf-4020-b40c-318aafbb9eee\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-85479b4b77-lsgx4"
Apr 16 18:10:52.149199 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:52.149176 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fa9abd85-b468-44db-95b1-d394c1fb105a-tmp\") pod \"klusterlet-addon-workmgr-54f49bd968-qqlvt\" (UID: \"fa9abd85-b468-44db-95b1-d394c1fb105a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-54f49bd968-qqlvt"
Apr 16 18:10:52.149577 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:52.149555 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fa9abd85-b468-44db-95b1-d394c1fb105a-tmp\") pod \"klusterlet-addon-workmgr-54f49bd968-qqlvt\" (UID: \"fa9abd85-b468-44db-95b1-d394c1fb105a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-54f49bd968-qqlvt"
Apr 16 18:10:52.151735 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:52.151709 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/fa9abd85-b468-44db-95b1-d394c1fb105a-klusterlet-config\") pod \"klusterlet-addon-workmgr-54f49bd968-qqlvt\" (UID: \"fa9abd85-b468-44db-95b1-d394c1fb105a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-54f49bd968-qqlvt"
Apr 16 18:10:52.151854 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:52.151735 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/468e41ab-46bf-4020-b40c-318aafbb9eee-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-85479b4b77-lsgx4\" (UID: \"468e41ab-46bf-4020-b40c-318aafbb9eee\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-85479b4b77-lsgx4"
Apr 16 18:10:52.161078 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:52.161048 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6gdd\" (UniqueName: \"kubernetes.io/projected/fa9abd85-b468-44db-95b1-d394c1fb105a-kube-api-access-d6gdd\") pod \"klusterlet-addon-workmgr-54f49bd968-qqlvt\" (UID: \"fa9abd85-b468-44db-95b1-d394c1fb105a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-54f49bd968-qqlvt"
Apr 16 18:10:52.161078 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:52.161076 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvx7b\" (UniqueName: \"kubernetes.io/projected/468e41ab-46bf-4020-b40c-318aafbb9eee-kube-api-access-xvx7b\") pod \"managed-serviceaccount-addon-agent-85479b4b77-lsgx4\" (UID: \"468e41ab-46bf-4020-b40c-318aafbb9eee\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-85479b4b77-lsgx4"
Apr 16 18:10:52.202299 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:52.202193 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fswkr" event={"ID":"4b2b1c61-3440-4f11-9320-0d47781218e5","Type":"ContainerStarted","Data":"e65cf4b3678231836acbe72783c7bc37a92ed38519a3dfc8b03ef55c5d70c18f"}
Apr 16 18:10:52.202299 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:52.202283 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-fswkr"
Apr 16 18:10:52.203472 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:52.203450 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-sx24k" event={"ID":"15fcda3c-2ebe-475b-bd0f-7c9f1ed74875","Type":"ContainerStarted","Data":"67b5c8defda37d956dcf569bdd90e8953b1cccc06f60fdac464ce23b51037310"}
Apr 16 18:10:52.220943 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:52.220893 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-fswkr" podStartSLOduration=67.869426532 podStartE2EDuration="1m12.220879047s" podCreationTimestamp="2026-04-16 18:09:40 +0000 UTC" firstStartedPulling="2026-04-16 18:10:47.006405167 +0000 UTC m=+66.661639998" lastFinishedPulling="2026-04-16 18:10:51.357857679 +0000 UTC m=+71.013092513" observedRunningTime="2026-04-16 18:10:52.220261935 +0000 UTC m=+71.875496789" watchObservedRunningTime="2026-04-16 18:10:52.220879047 +0000 UTC m=+71.876113900"
Apr 16 18:10:52.237530 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:52.237483 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-sx24k" podStartSLOduration=66.768216011 podStartE2EDuration="1m11.237470159s" podCreationTimestamp="2026-04-16 18:09:41 +0000 UTC" firstStartedPulling="2026-04-16 18:10:46.891913156 +0000 UTC m=+66.547147986" lastFinishedPulling="2026-04-16 18:10:51.361167302 +0000 UTC m=+71.016402134" observedRunningTime="2026-04-16 18:10:52.237325962 +0000 UTC m=+71.892560816" watchObservedRunningTime="2026-04-16 18:10:52.237470159 +0000 UTC m=+71.892705011"
Apr 16 18:10:52.260981 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:52.260943 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-85479b4b77-lsgx4"
Apr 16 18:10:52.286838 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:52.286779 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-54f49bd968-qqlvt"
Apr 16 18:10:52.394151 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:52.394113 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-85479b4b77-lsgx4"]
Apr 16 18:10:52.398572 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:10:52.398538 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod468e41ab_46bf_4020_b40c_318aafbb9eee.slice/crio-14dad19c0816f1da698124a22a84d1a431d7e3501969155a42d9da24abfde4af WatchSource:0}: Error finding container 14dad19c0816f1da698124a22a84d1a431d7e3501969155a42d9da24abfde4af: Status 404 returned error can't find the container with id 14dad19c0816f1da698124a22a84d1a431d7e3501969155a42d9da24abfde4af
Apr 16 18:10:52.423149 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:52.423117 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-54f49bd968-qqlvt"]
Apr 16 18:10:52.426635 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:10:52.426609 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa9abd85_b468_44db_95b1_d394c1fb105a.slice/crio-ff8dda3d2ac1d6f453770d8c62372a8bb27dd6f514a637ab395682e51a8c7f02 WatchSource:0}: Error finding container ff8dda3d2ac1d6f453770d8c62372a8bb27dd6f514a637ab395682e51a8c7f02: Status 404 returned error can't find the container with id ff8dda3d2ac1d6f453770d8c62372a8bb27dd6f514a637ab395682e51a8c7f02
Apr 16 18:10:53.207184 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:53.207142 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-54f49bd968-qqlvt" event={"ID":"fa9abd85-b468-44db-95b1-d394c1fb105a","Type":"ContainerStarted","Data":"ff8dda3d2ac1d6f453770d8c62372a8bb27dd6f514a637ab395682e51a8c7f02"}
Apr 16 18:10:53.208387 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:53.208356 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-85479b4b77-lsgx4" event={"ID":"468e41ab-46bf-4020-b40c-318aafbb9eee","Type":"ContainerStarted","Data":"14dad19c0816f1da698124a22a84d1a431d7e3501969155a42d9da24abfde4af"}
Apr 16 18:10:55.213408 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:55.213374 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-85479b4b77-lsgx4" event={"ID":"468e41ab-46bf-4020-b40c-318aafbb9eee","Type":"ContainerStarted","Data":"e483c4c18bf7210edeab905a433674f5f4a284afeb4e28dfe489479060b67c82"}
Apr 16 18:10:55.232123 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:55.232047 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-85479b4b77-lsgx4" podStartSLOduration=1.804931187 podStartE2EDuration="4.232025973s" podCreationTimestamp="2026-04-16 18:10:51 +0000 UTC" firstStartedPulling="2026-04-16 18:10:52.400898509 +0000 UTC m=+72.056133355" lastFinishedPulling="2026-04-16 18:10:54.827993306 +0000 UTC m=+74.483228141" observedRunningTime="2026-04-16 18:10:55.231394907 +0000 UTC m=+74.886629759" watchObservedRunningTime="2026-04-16 18:10:55.232025973 +0000 UTC m=+74.887260827"
Apr 16 18:10:57.218193 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:57.218148 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-54f49bd968-qqlvt" event={"ID":"fa9abd85-b468-44db-95b1-d394c1fb105a","Type":"ContainerStarted","Data":"36588db6f7269e397db9d97ae46bc8223af38c05d9ab3493db1c656814b196d5"}
Apr 16 18:10:57.221485 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:57.218718 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-54f49bd968-qqlvt"
Apr 16 18:10:57.222237 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:57.222215 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-54f49bd968-qqlvt"
Apr 16 18:10:57.238018 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:10:57.237965 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-54f49bd968-qqlvt" podStartSLOduration=2.228508448 podStartE2EDuration="6.237950623s" podCreationTimestamp="2026-04-16 18:10:51 +0000 UTC" firstStartedPulling="2026-04-16 18:10:52.428448109 +0000 UTC m=+72.083682941" lastFinishedPulling="2026-04-16 18:10:56.437890281 +0000 UTC m=+76.093125116" observedRunningTime="2026-04-16 18:10:57.236603137 +0000 UTC m=+76.891838015" watchObservedRunningTime="2026-04-16 18:10:57.237950623 +0000 UTC m=+76.893185476"
Apr 16 18:11:18.128510 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:18.128446 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4c20834-fffd-49b6-be94-da4be1bc80a8-cert\") pod \"ingress-canary-2zvsx\" (UID: \"e4c20834-fffd-49b6-be94-da4be1bc80a8\") " pod="openshift-ingress-canary/ingress-canary-2zvsx"
Apr 16 18:11:18.128510 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:18.128516 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/befca98d-bc99-4ce7-82eb-9f84457dc655-metrics-tls\") pod \"dns-default-mj8zq\" (UID: \"befca98d-bc99-4ce7-82eb-9f84457dc655\") " pod="openshift-dns/dns-default-mj8zq"
Apr 16 18:11:18.129090 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:11:18.128611 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:11:18.129090 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:11:18.128656 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:11:18.129090 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:11:18.128690 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4c20834-fffd-49b6-be94-da4be1bc80a8-cert podName:e4c20834-fffd-49b6-be94-da4be1bc80a8 nodeName:}" failed. No retries permitted until 2026-04-16 18:12:22.128674968 +0000 UTC m=+161.783909798 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4c20834-fffd-49b6-be94-da4be1bc80a8-cert") pod "ingress-canary-2zvsx" (UID: "e4c20834-fffd-49b6-be94-da4be1bc80a8") : secret "canary-serving-cert" not found
Apr 16 18:11:18.129090 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:11:18.128719 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/befca98d-bc99-4ce7-82eb-9f84457dc655-metrics-tls podName:befca98d-bc99-4ce7-82eb-9f84457dc655 nodeName:}" failed. No retries permitted until 2026-04-16 18:12:22.128704911 +0000 UTC m=+161.783939758 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/befca98d-bc99-4ce7-82eb-9f84457dc655-metrics-tls") pod "dns-default-mj8zq" (UID: "befca98d-bc99-4ce7-82eb-9f84457dc655") : secret "dns-default-metrics-tls" not found
Apr 16 18:11:23.211361 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:23.211322 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-fswkr"
Apr 16 18:11:50.761517 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:50.761453 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/182ef3ca-8527-40a2-b1a7-c714bd3509c5-metrics-certs\") pod \"network-metrics-daemon-kgtvr\" (UID: \"182ef3ca-8527-40a2-b1a7-c714bd3509c5\") " pod="openshift-multus/network-metrics-daemon-kgtvr"
Apr 16 18:11:50.762037 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:11:50.761603 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 18:11:50.762037 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:11:50.761683 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/182ef3ca-8527-40a2-b1a7-c714bd3509c5-metrics-certs podName:182ef3ca-8527-40a2-b1a7-c714bd3509c5 nodeName:}" failed. No retries permitted until 2026-04-16 18:13:52.761665053 +0000 UTC m=+252.416899927 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/182ef3ca-8527-40a2-b1a7-c714bd3509c5-metrics-certs") pod "network-metrics-daemon-kgtvr" (UID: "182ef3ca-8527-40a2-b1a7-c714bd3509c5") : secret "metrics-daemon-secret" not found
Apr 16 18:11:55.772266 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:55.772223 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-4lsvd"]
Apr 16 18:11:55.775263 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:55.775239 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-4lsvd"
Apr 16 18:11:55.778212 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:55.778184 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 16 18:11:55.778365 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:55.778321 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:11:55.778436 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:55.778325 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 16 18:11:55.779514 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:55.779495 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 16 18:11:55.779646 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:55.779627 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-nk44x\""
Apr 16 18:11:55.784619 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:55.784598 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 16 18:11:55.801490 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:55.801459 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-4lsvd"]
Apr 16 18:11:55.876054 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:55.876010 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-r45bl"]
Apr 16 18:11:55.878616 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:55.878598 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-wjllt"]
Apr 16 18:11:55.878761 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:55.878745 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-r45bl"
Apr 16 18:11:55.881722 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:55.881692 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 16 18:11:55.881921 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:55.881898 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 16 18:11:55.882015 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:55.881779 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-wjllt"
Apr 16 18:11:55.882130 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:55.881912 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:11:55.882263 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:55.882231 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-6kj94\""
Apr 16 18:11:55.885564 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:55.882597 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 16 18:11:55.885564 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:55.884806 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-mc7jw\""
Apr 16 18:11:55.891225 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:55.891200 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-wjllt"]
Apr 16 18:11:55.892095 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:55.892071 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-r45bl"]
Apr 16 18:11:55.896955 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:55.896931 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1be4b879-19c7-4497-badb-3f90683cdd48-serving-cert\") pod \"console-operator-d87b8d5fc-4lsvd\" (UID: \"1be4b879-19c7-4497-badb-3f90683cdd48\") " pod="openshift-console-operator/console-operator-d87b8d5fc-4lsvd"
Apr 16 18:11:55.897074 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:55.896966 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1be4b879-19c7-4497-badb-3f90683cdd48-config\") pod \"console-operator-d87b8d5fc-4lsvd\" (UID: \"1be4b879-19c7-4497-badb-3f90683cdd48\") " pod="openshift-console-operator/console-operator-d87b8d5fc-4lsvd"
Apr 16 18:11:55.897074 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:55.896992 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdvxn\" (UniqueName: \"kubernetes.io/projected/1be4b879-19c7-4497-badb-3f90683cdd48-kube-api-access-tdvxn\") pod \"console-operator-d87b8d5fc-4lsvd\" (UID: \"1be4b879-19c7-4497-badb-3f90683cdd48\") " pod="openshift-console-operator/console-operator-d87b8d5fc-4lsvd"
Apr 16 18:11:55.897192 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:55.897164 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1be4b879-19c7-4497-badb-3f90683cdd48-trusted-ca\") pod \"console-operator-d87b8d5fc-4lsvd\" (UID: \"1be4b879-19c7-4497-badb-3f90683cdd48\") " pod="openshift-console-operator/console-operator-d87b8d5fc-4lsvd"
Apr 16 18:11:55.997844 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:55.997781 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1be4b879-19c7-4497-badb-3f90683cdd48-trusted-ca\") pod \"console-operator-d87b8d5fc-4lsvd\" (UID: \"1be4b879-19c7-4497-badb-3f90683cdd48\") " pod="openshift-console-operator/console-operator-d87b8d5fc-4lsvd"
Apr 16 18:11:55.997844 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:55.997845 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e47f34ff-4372-4215-ba86-42576df70e3d-serving-cert\") pod \"service-ca-operator-69965bb79d-r45bl\" (UID: \"e47f34ff-4372-4215-ba86-42576df70e3d\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-r45bl"
Apr 16 18:11:55.998053 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:55.998008 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1be4b879-19c7-4497-badb-3f90683cdd48-serving-cert\") pod \"console-operator-d87b8d5fc-4lsvd\" (UID: \"1be4b879-19c7-4497-badb-3f90683cdd48\") " pod="openshift-console-operator/console-operator-d87b8d5fc-4lsvd"
Apr 16 18:11:55.998094 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:55.998049 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1be4b879-19c7-4497-badb-3f90683cdd48-config\") pod \"console-operator-d87b8d5fc-4lsvd\" (UID: \"1be4b879-19c7-4497-badb-3f90683cdd48\") " pod="openshift-console-operator/console-operator-d87b8d5fc-4lsvd"
Apr 16 18:11:55.998094 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:55.998083 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnpk6\" (UniqueName: \"kubernetes.io/projected/dd3fb5f9-97ce-4755-a871-397dee971b05-kube-api-access-nnpk6\") pod \"network-check-source-7b678d77c7-wjllt\" (UID: \"dd3fb5f9-97ce-4755-a871-397dee971b05\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-wjllt"
Apr 16 18:11:55.998157 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:55.998120 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tdvxn\" (UniqueName: \"kubernetes.io/projected/1be4b879-19c7-4497-badb-3f90683cdd48-kube-api-access-tdvxn\") pod \"console-operator-d87b8d5fc-4lsvd\" (UID: \"1be4b879-19c7-4497-badb-3f90683cdd48\") " pod="openshift-console-operator/console-operator-d87b8d5fc-4lsvd"
Apr 16 18:11:55.998255 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:55.998220 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e47f34ff-4372-4215-ba86-42576df70e3d-config\") pod \"service-ca-operator-69965bb79d-r45bl\" (UID: \"e47f34ff-4372-4215-ba86-42576df70e3d\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-r45bl"
Apr 16 18:11:55.998382 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:55.998362 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26s9h\" (UniqueName: \"kubernetes.io/projected/e47f34ff-4372-4215-ba86-42576df70e3d-kube-api-access-26s9h\") pod \"service-ca-operator-69965bb79d-r45bl\" (UID: \"e47f34ff-4372-4215-ba86-42576df70e3d\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-r45bl"
Apr 16 18:11:55.998742 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:55.998716 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1be4b879-19c7-4497-badb-3f90683cdd48-config\") pod \"console-operator-d87b8d5fc-4lsvd\" (UID: \"1be4b879-19c7-4497-badb-3f90683cdd48\") " pod="openshift-console-operator/console-operator-d87b8d5fc-4lsvd"
Apr 16 18:11:55.998873 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:55.998718 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1be4b879-19c7-4497-badb-3f90683cdd48-trusted-ca\") pod \"console-operator-d87b8d5fc-4lsvd\" (UID: \"1be4b879-19c7-4497-badb-3f90683cdd48\") " pod="openshift-console-operator/console-operator-d87b8d5fc-4lsvd"
Apr 16 18:11:56.000939 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:56.000917 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1be4b879-19c7-4497-badb-3f90683cdd48-serving-cert\") pod \"console-operator-d87b8d5fc-4lsvd\" (UID: \"1be4b879-19c7-4497-badb-3f90683cdd48\") " pod="openshift-console-operator/console-operator-d87b8d5fc-4lsvd"
Apr 16 18:11:56.019646 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:56.019605 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdvxn\" (UniqueName: \"kubernetes.io/projected/1be4b879-19c7-4497-badb-3f90683cdd48-kube-api-access-tdvxn\") pod \"console-operator-d87b8d5fc-4lsvd\" (UID: \"1be4b879-19c7-4497-badb-3f90683cdd48\") " pod="openshift-console-operator/console-operator-d87b8d5fc-4lsvd"
Apr 16 18:11:56.084813 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:56.084695 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-4lsvd"
Apr 16 18:11:56.099220 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:56.099166 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e47f34ff-4372-4215-ba86-42576df70e3d-serving-cert\") pod \"service-ca-operator-69965bb79d-r45bl\" (UID: \"e47f34ff-4372-4215-ba86-42576df70e3d\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-r45bl"
Apr 16 18:11:56.099388 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:56.099294 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nnpk6\" (UniqueName: \"kubernetes.io/projected/dd3fb5f9-97ce-4755-a871-397dee971b05-kube-api-access-nnpk6\") pod \"network-check-source-7b678d77c7-wjllt\" (UID: \"dd3fb5f9-97ce-4755-a871-397dee971b05\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-wjllt"
Apr 16 18:11:56.099458 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:56.099401 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e47f34ff-4372-4215-ba86-42576df70e3d-config\") pod \"service-ca-operator-69965bb79d-r45bl\" (UID: \"e47f34ff-4372-4215-ba86-42576df70e3d\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-r45bl"
Apr 16 18:11:56.099458 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:56.099442 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26s9h\" (UniqueName: \"kubernetes.io/projected/e47f34ff-4372-4215-ba86-42576df70e3d-kube-api-access-26s9h\") pod \"service-ca-operator-69965bb79d-r45bl\" (UID: \"e47f34ff-4372-4215-ba86-42576df70e3d\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-r45bl"
Apr 16 18:11:56.100269 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:56.100233 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e47f34ff-4372-4215-ba86-42576df70e3d-config\") pod \"service-ca-operator-69965bb79d-r45bl\" (UID: \"e47f34ff-4372-4215-ba86-42576df70e3d\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-r45bl"
Apr 16 18:11:56.102301 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:56.102273 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e47f34ff-4372-4215-ba86-42576df70e3d-serving-cert\") pod \"service-ca-operator-69965bb79d-r45bl\" (UID: \"e47f34ff-4372-4215-ba86-42576df70e3d\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-r45bl"
Apr 16 18:11:56.116803 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:56.116771 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnpk6\" (UniqueName: \"kubernetes.io/projected/dd3fb5f9-97ce-4755-a871-397dee971b05-kube-api-access-nnpk6\") pod \"network-check-source-7b678d77c7-wjllt\" (UID: \"dd3fb5f9-97ce-4755-a871-397dee971b05\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-wjllt"
Apr 16 18:11:56.118364 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:56.118336 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-26s9h\" (UniqueName: \"kubernetes.io/projected/e47f34ff-4372-4215-ba86-42576df70e3d-kube-api-access-26s9h\") pod \"service-ca-operator-69965bb79d-r45bl\" (UID: \"e47f34ff-4372-4215-ba86-42576df70e3d\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-r45bl"
Apr 16 18:11:56.192249 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:56.192209 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-r45bl"
Apr 16 18:11:56.197117 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:56.197082 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-wjllt"
Apr 16 18:11:56.217280 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:56.217235 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-4lsvd"]
Apr 16 18:11:56.220404 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:11:56.220364 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1be4b879_19c7_4497_badb_3f90683cdd48.slice/crio-a0619f5469af4b8ec95219d5acf3bdb078e0b5dda5f84171c3b54e0bf73afd9b WatchSource:0}: Error finding container a0619f5469af4b8ec95219d5acf3bdb078e0b5dda5f84171c3b54e0bf73afd9b: Status 404 returned error can't find the container with id a0619f5469af4b8ec95219d5acf3bdb078e0b5dda5f84171c3b54e0bf73afd9b
Apr 16 18:11:56.331745 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:56.331711 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-4lsvd" event={"ID":"1be4b879-19c7-4497-badb-3f90683cdd48","Type":"ContainerStarted","Data":"a0619f5469af4b8ec95219d5acf3bdb078e0b5dda5f84171c3b54e0bf73afd9b"}
Apr 16 18:11:56.334489 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:56.334461 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-r45bl"]
Apr 16 18:11:56.337777 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:11:56.337747 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode47f34ff_4372_4215_ba86_42576df70e3d.slice/crio-15fc36b2c34686836eb74a40b40cf295586dee4dabf5ad10a5788c8a427fabce WatchSource:0}: Error finding container 15fc36b2c34686836eb74a40b40cf295586dee4dabf5ad10a5788c8a427fabce: Status 404 returned error can't find the container with id 15fc36b2c34686836eb74a40b40cf295586dee4dabf5ad10a5788c8a427fabce
Apr 16 18:11:56.353441 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:56.353404 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-wjllt"]
Apr 16 18:11:56.357053 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:11:56.357023 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd3fb5f9_97ce_4755_a871_397dee971b05.slice/crio-29e8b0ffdc68316eca055438f78da16ab98735d241ba5641f2a0466cf7e7554a WatchSource:0}: Error finding container 29e8b0ffdc68316eca055438f78da16ab98735d241ba5641f2a0466cf7e7554a: Status 404 returned error can't find the container with id 29e8b0ffdc68316eca055438f78da16ab98735d241ba5641f2a0466cf7e7554a
Apr 16 18:11:57.336572 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:57.336505 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-r45bl" event={"ID":"e47f34ff-4372-4215-ba86-42576df70e3d","Type":"ContainerStarted","Data":"15fc36b2c34686836eb74a40b40cf295586dee4dabf5ad10a5788c8a427fabce"}
Apr 16 18:11:57.338485 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:57.338421 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-wjllt" event={"ID":"dd3fb5f9-97ce-4755-a871-397dee971b05","Type":"ContainerStarted","Data":"f11e4f49f3b2b4b5da20b1a0de1fd30532c137a9193f9cdae390ab753cd8cb21"}
Apr 16 18:11:57.338485 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:57.338459 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-wjllt" event={"ID":"dd3fb5f9-97ce-4755-a871-397dee971b05","Type":"ContainerStarted","Data":"29e8b0ffdc68316eca055438f78da16ab98735d241ba5641f2a0466cf7e7554a"}
Apr 16 18:11:57.360804 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:57.360740 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-wjllt" podStartSLOduration=2.360721458 podStartE2EDuration="2.360721458s" podCreationTimestamp="2026-04-16 18:11:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:11:57.358144074 +0000 UTC m=+137.013378954" watchObservedRunningTime="2026-04-16 18:11:57.360721458 +0000 UTC m=+137.015956313"
Apr 16 18:11:59.349220 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:59.349187 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4lsvd_1be4b879-19c7-4497-badb-3f90683cdd48/console-operator/0.log"
Apr 16 18:11:59.349662 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:59.349233 2574 generic.go:358] "Generic (PLEG): container finished" podID="1be4b879-19c7-4497-badb-3f90683cdd48" containerID="2e136b6fbd5cfe3853e4f861055ac744290dccc602e107a3fa77b723f8cddb5e" exitCode=255
Apr 16 18:11:59.349662 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:59.349307 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-4lsvd" event={"ID":"1be4b879-19c7-4497-badb-3f90683cdd48","Type":"ContainerDied","Data":"2e136b6fbd5cfe3853e4f861055ac744290dccc602e107a3fa77b723f8cddb5e"}
Apr 16 18:11:59.349662 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:59.349554 2574 scope.go:117] "RemoveContainer" containerID="2e136b6fbd5cfe3853e4f861055ac744290dccc602e107a3fa77b723f8cddb5e"
Apr 16 18:11:59.350677 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:59.350653 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-r45bl" event={"ID":"e47f34ff-4372-4215-ba86-42576df70e3d","Type":"ContainerStarted","Data":"1a8ada349354f4a1a58d9952cf2d9dcea4447326e68651e807ea22c51fa32199"}
Apr 16 18:11:59.390050 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:11:59.390004 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-r45bl" podStartSLOduration=2.155462548 podStartE2EDuration="4.389987785s" podCreationTimestamp="2026-04-16 18:11:55 +0000 UTC" firstStartedPulling="2026-04-16 18:11:56.339552651 +0000 UTC m=+135.994787483" lastFinishedPulling="2026-04-16 18:11:58.574077886 +0000 UTC m=+138.229312720" observedRunningTime="2026-04-16 18:11:59.38893608 +0000 UTC m=+139.044170962" watchObservedRunningTime="2026-04-16 18:11:59.389987785 +0000 UTC m=+139.045222637"
Apr 16 18:12:00.354581 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:00.354552 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4lsvd_1be4b879-19c7-4497-badb-3f90683cdd48/console-operator/1.log"
Apr 16 18:12:00.355016 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:00.354965 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4lsvd_1be4b879-19c7-4497-badb-3f90683cdd48/console-operator/0.log"
Apr 16 18:12:00.355016 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:00.354998 2574 generic.go:358] "Generic (PLEG): container finished" podID="1be4b879-19c7-4497-badb-3f90683cdd48" containerID="b4b2b0de532de89ea931237a8322ec9320767c60bbc67dc7efa0dbd4bb175bff" exitCode=255
Apr 16 18:12:00.355117 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:00.355047 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-4lsvd" event={"ID":"1be4b879-19c7-4497-badb-3f90683cdd48","Type":"ContainerDied","Data":"b4b2b0de532de89ea931237a8322ec9320767c60bbc67dc7efa0dbd4bb175bff"}
Apr 16 18:12:00.355117 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:00.355093 2574 scope.go:117] "RemoveContainer" containerID="2e136b6fbd5cfe3853e4f861055ac744290dccc602e107a3fa77b723f8cddb5e"
Apr 16 18:12:00.355344 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:00.355331 2574 scope.go:117] "RemoveContainer" containerID="b4b2b0de532de89ea931237a8322ec9320767c60bbc67dc7efa0dbd4bb175bff"
Apr 16 18:12:00.355580 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:12:00.355549 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-4lsvd_openshift-console-operator(1be4b879-19c7-4497-badb-3f90683cdd48)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-4lsvd" podUID="1be4b879-19c7-4497-badb-3f90683cdd48"
Apr 16 18:12:01.358566 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:01.358539 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4lsvd_1be4b879-19c7-4497-badb-3f90683cdd48/console-operator/1.log"
Apr 16 18:12:01.358959 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:01.358902 2574 scope.go:117] "RemoveContainer" containerID="b4b2b0de532de89ea931237a8322ec9320767c60bbc67dc7efa0dbd4bb175bff"
Apr 16 18:12:01.359075 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:12:01.359058 2574
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-4lsvd_openshift-console-operator(1be4b879-19c7-4497-badb-3f90683cdd48)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-4lsvd" podUID="1be4b879-19c7-4497-badb-3f90683cdd48" Apr 16 18:12:02.834521 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:02.834449 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-qk7qt"] Apr 16 18:12:02.838786 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:02.838760 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-qk7qt" Apr 16 18:12:02.841294 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:02.841265 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 18:12:02.841405 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:02.841280 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-jfnth\"" Apr 16 18:12:02.842455 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:02.842426 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 18:12:02.842455 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:02.842452 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 18:12:02.842630 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:02.842464 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 18:12:02.846940 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:02.846915 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-service-ca/service-ca-bfc587fb7-qk7qt"] Apr 16 18:12:02.862815 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:02.862786 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-b7tsj_25e589f1-86e1-42cd-a623-02b4361d82ee/dns-node-resolver/0.log" Apr 16 18:12:02.951084 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:02.951047 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3cdceeb2-6a01-43c4-8d9b-def700f64b32-signing-key\") pod \"service-ca-bfc587fb7-qk7qt\" (UID: \"3cdceeb2-6a01-43c4-8d9b-def700f64b32\") " pod="openshift-service-ca/service-ca-bfc587fb7-qk7qt" Apr 16 18:12:02.951260 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:02.951100 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3cdceeb2-6a01-43c4-8d9b-def700f64b32-signing-cabundle\") pod \"service-ca-bfc587fb7-qk7qt\" (UID: \"3cdceeb2-6a01-43c4-8d9b-def700f64b32\") " pod="openshift-service-ca/service-ca-bfc587fb7-qk7qt" Apr 16 18:12:02.951260 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:02.951167 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctxst\" (UniqueName: \"kubernetes.io/projected/3cdceeb2-6a01-43c4-8d9b-def700f64b32-kube-api-access-ctxst\") pod \"service-ca-bfc587fb7-qk7qt\" (UID: \"3cdceeb2-6a01-43c4-8d9b-def700f64b32\") " pod="openshift-service-ca/service-ca-bfc587fb7-qk7qt" Apr 16 18:12:03.052604 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:03.052549 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3cdceeb2-6a01-43c4-8d9b-def700f64b32-signing-cabundle\") pod \"service-ca-bfc587fb7-qk7qt\" (UID: \"3cdceeb2-6a01-43c4-8d9b-def700f64b32\") " 
pod="openshift-service-ca/service-ca-bfc587fb7-qk7qt" Apr 16 18:12:03.052604 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:03.052608 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctxst\" (UniqueName: \"kubernetes.io/projected/3cdceeb2-6a01-43c4-8d9b-def700f64b32-kube-api-access-ctxst\") pod \"service-ca-bfc587fb7-qk7qt\" (UID: \"3cdceeb2-6a01-43c4-8d9b-def700f64b32\") " pod="openshift-service-ca/service-ca-bfc587fb7-qk7qt" Apr 16 18:12:03.052815 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:03.052661 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3cdceeb2-6a01-43c4-8d9b-def700f64b32-signing-key\") pod \"service-ca-bfc587fb7-qk7qt\" (UID: \"3cdceeb2-6a01-43c4-8d9b-def700f64b32\") " pod="openshift-service-ca/service-ca-bfc587fb7-qk7qt" Apr 16 18:12:03.053224 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:03.053202 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3cdceeb2-6a01-43c4-8d9b-def700f64b32-signing-cabundle\") pod \"service-ca-bfc587fb7-qk7qt\" (UID: \"3cdceeb2-6a01-43c4-8d9b-def700f64b32\") " pod="openshift-service-ca/service-ca-bfc587fb7-qk7qt" Apr 16 18:12:03.055071 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:03.055053 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3cdceeb2-6a01-43c4-8d9b-def700f64b32-signing-key\") pod \"service-ca-bfc587fb7-qk7qt\" (UID: \"3cdceeb2-6a01-43c4-8d9b-def700f64b32\") " pod="openshift-service-ca/service-ca-bfc587fb7-qk7qt" Apr 16 18:12:03.062647 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:03.062617 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctxst\" (UniqueName: \"kubernetes.io/projected/3cdceeb2-6a01-43c4-8d9b-def700f64b32-kube-api-access-ctxst\") pod 
\"service-ca-bfc587fb7-qk7qt\" (UID: \"3cdceeb2-6a01-43c4-8d9b-def700f64b32\") " pod="openshift-service-ca/service-ca-bfc587fb7-qk7qt" Apr 16 18:12:03.148752 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:03.148717 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-qk7qt" Apr 16 18:12:03.273887 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:03.273848 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-qk7qt"] Apr 16 18:12:03.277246 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:12:03.277213 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cdceeb2_6a01_43c4_8d9b_def700f64b32.slice/crio-396042dfb8a6138563cac6066e670d029d2c514e7e73615fb8b0cd526c63fc37 WatchSource:0}: Error finding container 396042dfb8a6138563cac6066e670d029d2c514e7e73615fb8b0cd526c63fc37: Status 404 returned error can't find the container with id 396042dfb8a6138563cac6066e670d029d2c514e7e73615fb8b0cd526c63fc37 Apr 16 18:12:03.366339 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:03.366287 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-qk7qt" event={"ID":"3cdceeb2-6a01-43c4-8d9b-def700f64b32","Type":"ContainerStarted","Data":"d0bd40ee4a35d996ce36b98791eee01b83d68230cbeda8280b6775cd5e5fad48"} Apr 16 18:12:03.366339 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:03.366333 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-qk7qt" event={"ID":"3cdceeb2-6a01-43c4-8d9b-def700f64b32","Type":"ContainerStarted","Data":"396042dfb8a6138563cac6066e670d029d2c514e7e73615fb8b0cd526c63fc37"} Apr 16 18:12:03.386734 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:03.386685 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-qk7qt" 
podStartSLOduration=1.386664564 podStartE2EDuration="1.386664564s" podCreationTimestamp="2026-04-16 18:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:12:03.386611704 +0000 UTC m=+143.041846557" watchObservedRunningTime="2026-04-16 18:12:03.386664564 +0000 UTC m=+143.041899418" Apr 16 18:12:04.063193 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:04.063166 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fd7p9_8e7dc651-2c6d-4d3f-912d-c2f49dfd76a0/node-ca/0.log" Apr 16 18:12:06.085425 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:06.085385 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-4lsvd" Apr 16 18:12:06.085425 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:06.085431 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-4lsvd" Apr 16 18:12:06.086008 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:06.085922 2574 scope.go:117] "RemoveContainer" containerID="b4b2b0de532de89ea931237a8322ec9320767c60bbc67dc7efa0dbd4bb175bff" Apr 16 18:12:06.086149 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:12:06.086128 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-4lsvd_openshift-console-operator(1be4b879-19c7-4497-badb-3f90683cdd48)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-4lsvd" podUID="1be4b879-19c7-4497-badb-3f90683cdd48" Apr 16 18:12:17.280583 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:12:17.280524 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: 
context deadline exceeded" pod="openshift-dns/dns-default-mj8zq" podUID="befca98d-bc99-4ce7-82eb-9f84457dc655" Apr 16 18:12:17.301400 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:12:17.301371 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-2zvsx" podUID="e4c20834-fffd-49b6-be94-da4be1bc80a8" Apr 16 18:12:17.406179 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:17.406149 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mj8zq" Apr 16 18:12:17.943612 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:12:17.943566 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-kgtvr" podUID="182ef3ca-8527-40a2-b1a7-c714bd3509c5" Apr 16 18:12:18.934179 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:18.934143 2574 scope.go:117] "RemoveContainer" containerID="b4b2b0de532de89ea931237a8322ec9320767c60bbc67dc7efa0dbd4bb175bff" Apr 16 18:12:19.413810 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:19.413783 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4lsvd_1be4b879-19c7-4497-badb-3f90683cdd48/console-operator/1.log" Apr 16 18:12:19.413990 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:19.413889 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-4lsvd" event={"ID":"1be4b879-19c7-4497-badb-3f90683cdd48","Type":"ContainerStarted","Data":"795cf25f03b72bd1b32f3a3e23b1c32a6af5929d253c04505f7bcfb3a20c9418"} Apr 16 18:12:19.414168 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:19.414150 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-console-operator/console-operator-d87b8d5fc-4lsvd" Apr 16 18:12:19.435448 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:19.435397 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-4lsvd" podStartSLOduration=22.087581 podStartE2EDuration="24.435382826s" podCreationTimestamp="2026-04-16 18:11:55 +0000 UTC" firstStartedPulling="2026-04-16 18:11:56.222735357 +0000 UTC m=+135.877970188" lastFinishedPulling="2026-04-16 18:11:58.57053718 +0000 UTC m=+138.225772014" observedRunningTime="2026-04-16 18:12:19.433516362 +0000 UTC m=+159.088751215" watchObservedRunningTime="2026-04-16 18:12:19.435382826 +0000 UTC m=+159.090617679" Apr 16 18:12:19.547336 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:19.547306 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-4lsvd" Apr 16 18:12:22.204661 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:22.204573 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/befca98d-bc99-4ce7-82eb-9f84457dc655-metrics-tls\") pod \"dns-default-mj8zq\" (UID: \"befca98d-bc99-4ce7-82eb-9f84457dc655\") " pod="openshift-dns/dns-default-mj8zq" Apr 16 18:12:22.205135 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:22.204750 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4c20834-fffd-49b6-be94-da4be1bc80a8-cert\") pod \"ingress-canary-2zvsx\" (UID: \"e4c20834-fffd-49b6-be94-da4be1bc80a8\") " pod="openshift-ingress-canary/ingress-canary-2zvsx" Apr 16 18:12:22.207057 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:22.207032 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/befca98d-bc99-4ce7-82eb-9f84457dc655-metrics-tls\") pod \"dns-default-mj8zq\" (UID: 
\"befca98d-bc99-4ce7-82eb-9f84457dc655\") " pod="openshift-dns/dns-default-mj8zq" Apr 16 18:12:22.207191 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:22.207170 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4c20834-fffd-49b6-be94-da4be1bc80a8-cert\") pod \"ingress-canary-2zvsx\" (UID: \"e4c20834-fffd-49b6-be94-da4be1bc80a8\") " pod="openshift-ingress-canary/ingress-canary-2zvsx" Apr 16 18:12:22.512476 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:22.512390 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-bnqh9\"" Apr 16 18:12:22.518336 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:22.518317 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mj8zq" Apr 16 18:12:22.639966 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:22.639925 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mj8zq"] Apr 16 18:12:22.642947 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:12:22.642919 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbefca98d_bc99_4ce7_82eb_9f84457dc655.slice/crio-343a4eca5c4389652902862de1bb2665572bf4648459a9f1277f18f0441cff7c WatchSource:0}: Error finding container 343a4eca5c4389652902862de1bb2665572bf4648459a9f1277f18f0441cff7c: Status 404 returned error can't find the container with id 343a4eca5c4389652902862de1bb2665572bf4648459a9f1277f18f0441cff7c Apr 16 18:12:23.424908 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:23.424865 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mj8zq" event={"ID":"befca98d-bc99-4ce7-82eb-9f84457dc655","Type":"ContainerStarted","Data":"343a4eca5c4389652902862de1bb2665572bf4648459a9f1277f18f0441cff7c"} Apr 16 18:12:24.428516 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:24.428481 2574 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mj8zq" event={"ID":"befca98d-bc99-4ce7-82eb-9f84457dc655","Type":"ContainerStarted","Data":"0725eda328a8d6973b06d9707b167395c8dbb28c645ea5e5ca94e0762e4f7f24"} Apr 16 18:12:24.428516 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:24.428518 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mj8zq" event={"ID":"befca98d-bc99-4ce7-82eb-9f84457dc655","Type":"ContainerStarted","Data":"a0e560c4af7218b653c485a94570a89ea36e72c56ca2ab6e621de31e2bed57cc"} Apr 16 18:12:24.428984 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:24.428633 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-mj8zq" Apr 16 18:12:24.446967 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:24.446915 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-mj8zq" podStartSLOduration=129.274355475 podStartE2EDuration="2m10.446897566s" podCreationTimestamp="2026-04-16 18:10:14 +0000 UTC" firstStartedPulling="2026-04-16 18:12:22.644721569 +0000 UTC m=+162.299956404" lastFinishedPulling="2026-04-16 18:12:23.817263664 +0000 UTC m=+163.472498495" observedRunningTime="2026-04-16 18:12:24.446274138 +0000 UTC m=+164.101508992" watchObservedRunningTime="2026-04-16 18:12:24.446897566 +0000 UTC m=+164.102132449" Apr 16 18:12:26.459880 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.459846 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-rl96g"] Apr 16 18:12:26.464812 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.464788 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rl96g" Apr 16 18:12:26.469754 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.469721 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 18:12:26.469754 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.469752 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 18:12:26.470045 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.469762 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 18:12:26.470045 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.469796 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 18:12:26.470045 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.469762 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-9j6xp\"" Apr 16 18:12:26.471735 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.471710 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rl96g"] Apr 16 18:12:26.538243 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.538207 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b0990fea-8fdf-473f-ab80-def726bcd0aa-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rl96g\" (UID: \"b0990fea-8fdf-473f-ab80-def726bcd0aa\") " pod="openshift-insights/insights-runtime-extractor-rl96g" Apr 16 18:12:26.538466 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.538265 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b0990fea-8fdf-473f-ab80-def726bcd0aa-crio-socket\") pod \"insights-runtime-extractor-rl96g\" (UID: \"b0990fea-8fdf-473f-ab80-def726bcd0aa\") " pod="openshift-insights/insights-runtime-extractor-rl96g" Apr 16 18:12:26.538466 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.538350 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b0990fea-8fdf-473f-ab80-def726bcd0aa-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rl96g\" (UID: \"b0990fea-8fdf-473f-ab80-def726bcd0aa\") " pod="openshift-insights/insights-runtime-extractor-rl96g" Apr 16 18:12:26.538466 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.538452 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22ph7\" (UniqueName: \"kubernetes.io/projected/b0990fea-8fdf-473f-ab80-def726bcd0aa-kube-api-access-22ph7\") pod \"insights-runtime-extractor-rl96g\" (UID: \"b0990fea-8fdf-473f-ab80-def726bcd0aa\") " pod="openshift-insights/insights-runtime-extractor-rl96g" Apr 16 18:12:26.538589 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.538487 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b0990fea-8fdf-473f-ab80-def726bcd0aa-data-volume\") pod \"insights-runtime-extractor-rl96g\" (UID: \"b0990fea-8fdf-473f-ab80-def726bcd0aa\") " pod="openshift-insights/insights-runtime-extractor-rl96g" Apr 16 18:12:26.557536 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.557497 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-545cbcd4f-txvcv"] Apr 16 18:12:26.560493 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.560478 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-545cbcd4f-txvcv" Apr 16 18:12:26.563552 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.563510 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 18:12:26.563696 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.563662 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 18:12:26.563750 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.563668 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 18:12:26.563991 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.563978 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-tt7tt\"" Apr 16 18:12:26.572023 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.571997 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 18:12:26.583086 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.583055 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-545cbcd4f-txvcv"] Apr 16 18:12:26.639748 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.639706 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b0990fea-8fdf-473f-ab80-def726bcd0aa-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rl96g\" (UID: \"b0990fea-8fdf-473f-ab80-def726bcd0aa\") " pod="openshift-insights/insights-runtime-extractor-rl96g" Apr 16 18:12:26.639993 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.639758 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1af7ee53-3f96-4f7d-8957-898bf7c0c8e9-installation-pull-secrets\") pod \"image-registry-545cbcd4f-txvcv\" (UID: \"1af7ee53-3f96-4f7d-8957-898bf7c0c8e9\") " pod="openshift-image-registry/image-registry-545cbcd4f-txvcv" Apr 16 18:12:26.639993 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.639785 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1af7ee53-3f96-4f7d-8957-898bf7c0c8e9-registry-tls\") pod \"image-registry-545cbcd4f-txvcv\" (UID: \"1af7ee53-3f96-4f7d-8957-898bf7c0c8e9\") " pod="openshift-image-registry/image-registry-545cbcd4f-txvcv" Apr 16 18:12:26.639993 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.639814 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b0990fea-8fdf-473f-ab80-def726bcd0aa-crio-socket\") pod \"insights-runtime-extractor-rl96g\" (UID: \"b0990fea-8fdf-473f-ab80-def726bcd0aa\") " pod="openshift-insights/insights-runtime-extractor-rl96g" Apr 16 18:12:26.639993 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.639863 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1af7ee53-3f96-4f7d-8957-898bf7c0c8e9-bound-sa-token\") pod \"image-registry-545cbcd4f-txvcv\" (UID: \"1af7ee53-3f96-4f7d-8957-898bf7c0c8e9\") " pod="openshift-image-registry/image-registry-545cbcd4f-txvcv" Apr 16 18:12:26.639993 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.639885 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b0990fea-8fdf-473f-ab80-def726bcd0aa-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rl96g\" (UID: \"b0990fea-8fdf-473f-ab80-def726bcd0aa\") " 
pod="openshift-insights/insights-runtime-extractor-rl96g" Apr 16 18:12:26.639993 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.639909 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1af7ee53-3f96-4f7d-8957-898bf7c0c8e9-ca-trust-extracted\") pod \"image-registry-545cbcd4f-txvcv\" (UID: \"1af7ee53-3f96-4f7d-8957-898bf7c0c8e9\") " pod="openshift-image-registry/image-registry-545cbcd4f-txvcv" Apr 16 18:12:26.639993 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.639934 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1af7ee53-3f96-4f7d-8957-898bf7c0c8e9-image-registry-private-configuration\") pod \"image-registry-545cbcd4f-txvcv\" (UID: \"1af7ee53-3f96-4f7d-8957-898bf7c0c8e9\") " pod="openshift-image-registry/image-registry-545cbcd4f-txvcv" Apr 16 18:12:26.640356 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.640055 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b0990fea-8fdf-473f-ab80-def726bcd0aa-crio-socket\") pod \"insights-runtime-extractor-rl96g\" (UID: \"b0990fea-8fdf-473f-ab80-def726bcd0aa\") " pod="openshift-insights/insights-runtime-extractor-rl96g" Apr 16 18:12:26.640356 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.640053 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsh47\" (UniqueName: \"kubernetes.io/projected/1af7ee53-3f96-4f7d-8957-898bf7c0c8e9-kube-api-access-nsh47\") pod \"image-registry-545cbcd4f-txvcv\" (UID: \"1af7ee53-3f96-4f7d-8957-898bf7c0c8e9\") " pod="openshift-image-registry/image-registry-545cbcd4f-txvcv" Apr 16 18:12:26.640356 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.640184 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-22ph7\" (UniqueName: \"kubernetes.io/projected/b0990fea-8fdf-473f-ab80-def726bcd0aa-kube-api-access-22ph7\") pod \"insights-runtime-extractor-rl96g\" (UID: \"b0990fea-8fdf-473f-ab80-def726bcd0aa\") " pod="openshift-insights/insights-runtime-extractor-rl96g" Apr 16 18:12:26.640356 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.640221 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b0990fea-8fdf-473f-ab80-def726bcd0aa-data-volume\") pod \"insights-runtime-extractor-rl96g\" (UID: \"b0990fea-8fdf-473f-ab80-def726bcd0aa\") " pod="openshift-insights/insights-runtime-extractor-rl96g" Apr 16 18:12:26.640356 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.640253 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1af7ee53-3f96-4f7d-8957-898bf7c0c8e9-trusted-ca\") pod \"image-registry-545cbcd4f-txvcv\" (UID: \"1af7ee53-3f96-4f7d-8957-898bf7c0c8e9\") " pod="openshift-image-registry/image-registry-545cbcd4f-txvcv" Apr 16 18:12:26.640356 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.640288 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1af7ee53-3f96-4f7d-8957-898bf7c0c8e9-registry-certificates\") pod \"image-registry-545cbcd4f-txvcv\" (UID: \"1af7ee53-3f96-4f7d-8957-898bf7c0c8e9\") " pod="openshift-image-registry/image-registry-545cbcd4f-txvcv" Apr 16 18:12:26.640653 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.640529 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b0990fea-8fdf-473f-ab80-def726bcd0aa-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rl96g\" (UID: 
\"b0990fea-8fdf-473f-ab80-def726bcd0aa\") " pod="openshift-insights/insights-runtime-extractor-rl96g" Apr 16 18:12:26.642062 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.642033 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b0990fea-8fdf-473f-ab80-def726bcd0aa-data-volume\") pod \"insights-runtime-extractor-rl96g\" (UID: \"b0990fea-8fdf-473f-ab80-def726bcd0aa\") " pod="openshift-insights/insights-runtime-extractor-rl96g" Apr 16 18:12:26.642354 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.642339 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b0990fea-8fdf-473f-ab80-def726bcd0aa-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rl96g\" (UID: \"b0990fea-8fdf-473f-ab80-def726bcd0aa\") " pod="openshift-insights/insights-runtime-extractor-rl96g" Apr 16 18:12:26.653456 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.653426 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-22ph7\" (UniqueName: \"kubernetes.io/projected/b0990fea-8fdf-473f-ab80-def726bcd0aa-kube-api-access-22ph7\") pod \"insights-runtime-extractor-rl96g\" (UID: \"b0990fea-8fdf-473f-ab80-def726bcd0aa\") " pod="openshift-insights/insights-runtime-extractor-rl96g" Apr 16 18:12:26.740884 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.740759 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1af7ee53-3f96-4f7d-8957-898bf7c0c8e9-ca-trust-extracted\") pod \"image-registry-545cbcd4f-txvcv\" (UID: \"1af7ee53-3f96-4f7d-8957-898bf7c0c8e9\") " pod="openshift-image-registry/image-registry-545cbcd4f-txvcv" Apr 16 18:12:26.740884 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.740801 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1af7ee53-3f96-4f7d-8957-898bf7c0c8e9-image-registry-private-configuration\") pod \"image-registry-545cbcd4f-txvcv\" (UID: \"1af7ee53-3f96-4f7d-8957-898bf7c0c8e9\") " pod="openshift-image-registry/image-registry-545cbcd4f-txvcv" Apr 16 18:12:26.740884 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.740852 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nsh47\" (UniqueName: \"kubernetes.io/projected/1af7ee53-3f96-4f7d-8957-898bf7c0c8e9-kube-api-access-nsh47\") pod \"image-registry-545cbcd4f-txvcv\" (UID: \"1af7ee53-3f96-4f7d-8957-898bf7c0c8e9\") " pod="openshift-image-registry/image-registry-545cbcd4f-txvcv" Apr 16 18:12:26.741151 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.740914 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1af7ee53-3f96-4f7d-8957-898bf7c0c8e9-trusted-ca\") pod \"image-registry-545cbcd4f-txvcv\" (UID: \"1af7ee53-3f96-4f7d-8957-898bf7c0c8e9\") " pod="openshift-image-registry/image-registry-545cbcd4f-txvcv" Apr 16 18:12:26.741151 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.740938 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1af7ee53-3f96-4f7d-8957-898bf7c0c8e9-registry-certificates\") pod \"image-registry-545cbcd4f-txvcv\" (UID: \"1af7ee53-3f96-4f7d-8957-898bf7c0c8e9\") " pod="openshift-image-registry/image-registry-545cbcd4f-txvcv" Apr 16 18:12:26.741151 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.741067 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1af7ee53-3f96-4f7d-8957-898bf7c0c8e9-installation-pull-secrets\") pod \"image-registry-545cbcd4f-txvcv\" (UID: \"1af7ee53-3f96-4f7d-8957-898bf7c0c8e9\") " 
pod="openshift-image-registry/image-registry-545cbcd4f-txvcv" Apr 16 18:12:26.741151 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.741118 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1af7ee53-3f96-4f7d-8957-898bf7c0c8e9-registry-tls\") pod \"image-registry-545cbcd4f-txvcv\" (UID: \"1af7ee53-3f96-4f7d-8957-898bf7c0c8e9\") " pod="openshift-image-registry/image-registry-545cbcd4f-txvcv" Apr 16 18:12:26.741343 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.741166 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1af7ee53-3f96-4f7d-8957-898bf7c0c8e9-bound-sa-token\") pod \"image-registry-545cbcd4f-txvcv\" (UID: \"1af7ee53-3f96-4f7d-8957-898bf7c0c8e9\") " pod="openshift-image-registry/image-registry-545cbcd4f-txvcv" Apr 16 18:12:26.741424 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.741397 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1af7ee53-3f96-4f7d-8957-898bf7c0c8e9-ca-trust-extracted\") pod \"image-registry-545cbcd4f-txvcv\" (UID: \"1af7ee53-3f96-4f7d-8957-898bf7c0c8e9\") " pod="openshift-image-registry/image-registry-545cbcd4f-txvcv" Apr 16 18:12:26.741974 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.741951 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1af7ee53-3f96-4f7d-8957-898bf7c0c8e9-registry-certificates\") pod \"image-registry-545cbcd4f-txvcv\" (UID: \"1af7ee53-3f96-4f7d-8957-898bf7c0c8e9\") " pod="openshift-image-registry/image-registry-545cbcd4f-txvcv" Apr 16 18:12:26.742123 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.742012 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/1af7ee53-3f96-4f7d-8957-898bf7c0c8e9-trusted-ca\") pod \"image-registry-545cbcd4f-txvcv\" (UID: \"1af7ee53-3f96-4f7d-8957-898bf7c0c8e9\") " pod="openshift-image-registry/image-registry-545cbcd4f-txvcv" Apr 16 18:12:26.743439 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.743411 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1af7ee53-3f96-4f7d-8957-898bf7c0c8e9-image-registry-private-configuration\") pod \"image-registry-545cbcd4f-txvcv\" (UID: \"1af7ee53-3f96-4f7d-8957-898bf7c0c8e9\") " pod="openshift-image-registry/image-registry-545cbcd4f-txvcv" Apr 16 18:12:26.743557 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.743525 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1af7ee53-3f96-4f7d-8957-898bf7c0c8e9-installation-pull-secrets\") pod \"image-registry-545cbcd4f-txvcv\" (UID: \"1af7ee53-3f96-4f7d-8957-898bf7c0c8e9\") " pod="openshift-image-registry/image-registry-545cbcd4f-txvcv" Apr 16 18:12:26.743557 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.743524 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1af7ee53-3f96-4f7d-8957-898bf7c0c8e9-registry-tls\") pod \"image-registry-545cbcd4f-txvcv\" (UID: \"1af7ee53-3f96-4f7d-8957-898bf7c0c8e9\") " pod="openshift-image-registry/image-registry-545cbcd4f-txvcv" Apr 16 18:12:26.756198 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.756156 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1af7ee53-3f96-4f7d-8957-898bf7c0c8e9-bound-sa-token\") pod \"image-registry-545cbcd4f-txvcv\" (UID: \"1af7ee53-3f96-4f7d-8957-898bf7c0c8e9\") " pod="openshift-image-registry/image-registry-545cbcd4f-txvcv" Apr 16 18:12:26.756893 ip-10-0-142-43 
kubenswrapper[2574]: I0416 18:12:26.756873 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsh47\" (UniqueName: \"kubernetes.io/projected/1af7ee53-3f96-4f7d-8957-898bf7c0c8e9-kube-api-access-nsh47\") pod \"image-registry-545cbcd4f-txvcv\" (UID: \"1af7ee53-3f96-4f7d-8957-898bf7c0c8e9\") " pod="openshift-image-registry/image-registry-545cbcd4f-txvcv" Apr 16 18:12:26.774516 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.774478 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rl96g" Apr 16 18:12:26.869345 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.869310 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-545cbcd4f-txvcv" Apr 16 18:12:26.916552 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:26.916387 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rl96g"] Apr 16 18:12:26.920122 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:12:26.920076 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0990fea_8fdf_473f_ab80_def726bcd0aa.slice/crio-ae98a6f44b03d8c645a4d6f020ff824ff535eb8b97ab2dc27323726954e7b0b9 WatchSource:0}: Error finding container ae98a6f44b03d8c645a4d6f020ff824ff535eb8b97ab2dc27323726954e7b0b9: Status 404 returned error can't find the container with id ae98a6f44b03d8c645a4d6f020ff824ff535eb8b97ab2dc27323726954e7b0b9 Apr 16 18:12:27.008145 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:27.008067 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-545cbcd4f-txvcv"] Apr 16 18:12:27.011581 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:12:27.011538 2574 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1af7ee53_3f96_4f7d_8957_898bf7c0c8e9.slice/crio-46799d93eb304f49627c34da44392f6030ea2fd4a804d7fbd41c1e2fb6b57bb7 WatchSource:0}: Error finding container 46799d93eb304f49627c34da44392f6030ea2fd4a804d7fbd41c1e2fb6b57bb7: Status 404 returned error can't find the container with id 46799d93eb304f49627c34da44392f6030ea2fd4a804d7fbd41c1e2fb6b57bb7 Apr 16 18:12:27.438171 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:27.438133 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-545cbcd4f-txvcv" event={"ID":"1af7ee53-3f96-4f7d-8957-898bf7c0c8e9","Type":"ContainerStarted","Data":"5a1e40e0e9940426cd62e0a4f3eaa7f15c930bcb5b064cec1aa41ff6bbe1b59d"} Apr 16 18:12:27.438171 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:27.438176 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-545cbcd4f-txvcv" event={"ID":"1af7ee53-3f96-4f7d-8957-898bf7c0c8e9","Type":"ContainerStarted","Data":"46799d93eb304f49627c34da44392f6030ea2fd4a804d7fbd41c1e2fb6b57bb7"} Apr 16 18:12:27.438471 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:27.438255 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-545cbcd4f-txvcv" Apr 16 18:12:27.439704 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:27.439678 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rl96g" event={"ID":"b0990fea-8fdf-473f-ab80-def726bcd0aa","Type":"ContainerStarted","Data":"72d3dd20e39dd249d7183a01a909b835bbe912ca9b073c3587bd6a42c63564fe"} Apr 16 18:12:27.439704 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:27.439707 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rl96g" 
event={"ID":"b0990fea-8fdf-473f-ab80-def726bcd0aa","Type":"ContainerStarted","Data":"ae98a6f44b03d8c645a4d6f020ff824ff535eb8b97ab2dc27323726954e7b0b9"} Apr 16 18:12:27.460493 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:27.460425 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-545cbcd4f-txvcv" podStartSLOduration=1.460409855 podStartE2EDuration="1.460409855s" podCreationTimestamp="2026-04-16 18:12:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:12:27.459140848 +0000 UTC m=+167.114375728" watchObservedRunningTime="2026-04-16 18:12:27.460409855 +0000 UTC m=+167.115644707" Apr 16 18:12:28.444234 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:28.444191 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rl96g" event={"ID":"b0990fea-8fdf-473f-ab80-def726bcd0aa","Type":"ContainerStarted","Data":"e19350d46d6cd6a94d5075272a2fd240cfb3ad4517d75645f95e25f40f2852d9"} Apr 16 18:12:29.449504 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:29.449471 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rl96g" event={"ID":"b0990fea-8fdf-473f-ab80-def726bcd0aa","Type":"ContainerStarted","Data":"a493d92846eac6c27f417c3f49bf7a25371333ff67395de165167ef452f457fe"} Apr 16 18:12:29.476572 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:29.476519 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-rl96g" podStartSLOduration=1.4270442700000001 podStartE2EDuration="3.47650477s" podCreationTimestamp="2026-04-16 18:12:26 +0000 UTC" firstStartedPulling="2026-04-16 18:12:26.979144098 +0000 UTC m=+166.634378929" lastFinishedPulling="2026-04-16 18:12:29.028604597 +0000 UTC m=+168.683839429" observedRunningTime="2026-04-16 18:12:29.476093345 +0000 
UTC m=+169.131328208" watchObservedRunningTime="2026-04-16 18:12:29.47650477 +0000 UTC m=+169.131739682" Apr 16 18:12:30.934622 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:30.934579 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgtvr" Apr 16 18:12:31.933186 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:31.933150 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2zvsx" Apr 16 18:12:31.936083 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:31.936064 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ffhs9\"" Apr 16 18:12:31.944302 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:31.944271 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2zvsx" Apr 16 18:12:32.064432 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:32.064368 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2zvsx"] Apr 16 18:12:32.068838 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:12:32.068789 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4c20834_fffd_49b6_be94_da4be1bc80a8.slice/crio-40eafcbb34602cb0b1ec2ee45c6fc64f51e4a8070d2ae5c06ae723c38ea76525 WatchSource:0}: Error finding container 40eafcbb34602cb0b1ec2ee45c6fc64f51e4a8070d2ae5c06ae723c38ea76525: Status 404 returned error can't find the container with id 40eafcbb34602cb0b1ec2ee45c6fc64f51e4a8070d2ae5c06ae723c38ea76525 Apr 16 18:12:32.459241 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:32.459203 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2zvsx" 
event={"ID":"e4c20834-fffd-49b6-be94-da4be1bc80a8","Type":"ContainerStarted","Data":"40eafcbb34602cb0b1ec2ee45c6fc64f51e4a8070d2ae5c06ae723c38ea76525"} Apr 16 18:12:33.196307 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:33.196267 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-xwkwz"] Apr 16 18:12:33.200947 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:33.200920 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-xwkwz" Apr 16 18:12:33.205518 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:33.205149 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 18:12:33.205518 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:33.205174 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 18:12:33.205518 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:33.205194 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 16 18:12:33.205518 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:33.205237 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 16 18:12:33.205518 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:33.205247 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-qtppp\"" Apr 16 18:12:33.205518 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:33.205149 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 18:12:33.211459 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:33.211425 2574 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-xwkwz"] Apr 16 18:12:33.297727 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:33.297689 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bc9d6d92-ba2b-4d9e-b44b-455ceaa353ef-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-xwkwz\" (UID: \"bc9d6d92-ba2b-4d9e-b44b-455ceaa353ef\") " pod="openshift-monitoring/prometheus-operator-78f957474d-xwkwz" Apr 16 18:12:33.297943 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:33.297739 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slqz4\" (UniqueName: \"kubernetes.io/projected/bc9d6d92-ba2b-4d9e-b44b-455ceaa353ef-kube-api-access-slqz4\") pod \"prometheus-operator-78f957474d-xwkwz\" (UID: \"bc9d6d92-ba2b-4d9e-b44b-455ceaa353ef\") " pod="openshift-monitoring/prometheus-operator-78f957474d-xwkwz" Apr 16 18:12:33.297943 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:33.297778 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/bc9d6d92-ba2b-4d9e-b44b-455ceaa353ef-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-xwkwz\" (UID: \"bc9d6d92-ba2b-4d9e-b44b-455ceaa353ef\") " pod="openshift-monitoring/prometheus-operator-78f957474d-xwkwz" Apr 16 18:12:33.297943 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:33.297814 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bc9d6d92-ba2b-4d9e-b44b-455ceaa353ef-metrics-client-ca\") pod \"prometheus-operator-78f957474d-xwkwz\" (UID: \"bc9d6d92-ba2b-4d9e-b44b-455ceaa353ef\") " pod="openshift-monitoring/prometheus-operator-78f957474d-xwkwz" Apr 16 
18:12:33.398773 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:33.398725 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bc9d6d92-ba2b-4d9e-b44b-455ceaa353ef-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-xwkwz\" (UID: \"bc9d6d92-ba2b-4d9e-b44b-455ceaa353ef\") " pod="openshift-monitoring/prometheus-operator-78f957474d-xwkwz" Apr 16 18:12:33.398988 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:33.398799 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-slqz4\" (UniqueName: \"kubernetes.io/projected/bc9d6d92-ba2b-4d9e-b44b-455ceaa353ef-kube-api-access-slqz4\") pod \"prometheus-operator-78f957474d-xwkwz\" (UID: \"bc9d6d92-ba2b-4d9e-b44b-455ceaa353ef\") " pod="openshift-monitoring/prometheus-operator-78f957474d-xwkwz" Apr 16 18:12:33.398988 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:33.398861 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/bc9d6d92-ba2b-4d9e-b44b-455ceaa353ef-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-xwkwz\" (UID: \"bc9d6d92-ba2b-4d9e-b44b-455ceaa353ef\") " pod="openshift-monitoring/prometheus-operator-78f957474d-xwkwz" Apr 16 18:12:33.398988 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:33.398895 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bc9d6d92-ba2b-4d9e-b44b-455ceaa353ef-metrics-client-ca\") pod \"prometheus-operator-78f957474d-xwkwz\" (UID: \"bc9d6d92-ba2b-4d9e-b44b-455ceaa353ef\") " pod="openshift-monitoring/prometheus-operator-78f957474d-xwkwz" Apr 16 18:12:33.399758 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:12:33.399530 2574 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret 
"prometheus-operator-tls" not found Apr 16 18:12:33.399758 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:12:33.399611 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc9d6d92-ba2b-4d9e-b44b-455ceaa353ef-prometheus-operator-tls podName:bc9d6d92-ba2b-4d9e-b44b-455ceaa353ef nodeName:}" failed. No retries permitted until 2026-04-16 18:12:33.899588564 +0000 UTC m=+173.554823401 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/bc9d6d92-ba2b-4d9e-b44b-455ceaa353ef-prometheus-operator-tls") pod "prometheus-operator-78f957474d-xwkwz" (UID: "bc9d6d92-ba2b-4d9e-b44b-455ceaa353ef") : secret "prometheus-operator-tls" not found Apr 16 18:12:33.399758 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:33.399708 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bc9d6d92-ba2b-4d9e-b44b-455ceaa353ef-metrics-client-ca\") pod \"prometheus-operator-78f957474d-xwkwz\" (UID: \"bc9d6d92-ba2b-4d9e-b44b-455ceaa353ef\") " pod="openshift-monitoring/prometheus-operator-78f957474d-xwkwz" Apr 16 18:12:33.402367 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:33.402341 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bc9d6d92-ba2b-4d9e-b44b-455ceaa353ef-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-xwkwz\" (UID: \"bc9d6d92-ba2b-4d9e-b44b-455ceaa353ef\") " pod="openshift-monitoring/prometheus-operator-78f957474d-xwkwz" Apr 16 18:12:33.411779 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:33.411748 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-slqz4\" (UniqueName: \"kubernetes.io/projected/bc9d6d92-ba2b-4d9e-b44b-455ceaa353ef-kube-api-access-slqz4\") pod \"prometheus-operator-78f957474d-xwkwz\" (UID: 
\"bc9d6d92-ba2b-4d9e-b44b-455ceaa353ef\") " pod="openshift-monitoring/prometheus-operator-78f957474d-xwkwz" Apr 16 18:12:33.902228 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:33.902191 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/bc9d6d92-ba2b-4d9e-b44b-455ceaa353ef-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-xwkwz\" (UID: \"bc9d6d92-ba2b-4d9e-b44b-455ceaa353ef\") " pod="openshift-monitoring/prometheus-operator-78f957474d-xwkwz" Apr 16 18:12:33.904721 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:33.904681 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/bc9d6d92-ba2b-4d9e-b44b-455ceaa353ef-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-xwkwz\" (UID: \"bc9d6d92-ba2b-4d9e-b44b-455ceaa353ef\") " pod="openshift-monitoring/prometheus-operator-78f957474d-xwkwz" Apr 16 18:12:34.112950 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:34.112905 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-xwkwz" Apr 16 18:12:34.233514 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:34.233480 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-xwkwz"] Apr 16 18:12:34.237509 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:12:34.237469 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc9d6d92_ba2b_4d9e_b44b_455ceaa353ef.slice/crio-cff8f6fb6eba26b265a704425cf51209bd9cddd4648c408ef31d1b41fae99327 WatchSource:0}: Error finding container cff8f6fb6eba26b265a704425cf51209bd9cddd4648c408ef31d1b41fae99327: Status 404 returned error can't find the container with id cff8f6fb6eba26b265a704425cf51209bd9cddd4648c408ef31d1b41fae99327 Apr 16 18:12:34.433418 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:34.433336 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-mj8zq" Apr 16 18:12:34.466363 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:34.466307 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2zvsx" event={"ID":"e4c20834-fffd-49b6-be94-da4be1bc80a8","Type":"ContainerStarted","Data":"18c1f880fdb658b612b72af1f51f933b5cfbc3ba12fffa5d6b86c1522856db0c"} Apr 16 18:12:34.467686 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:34.467652 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-xwkwz" event={"ID":"bc9d6d92-ba2b-4d9e-b44b-455ceaa353ef","Type":"ContainerStarted","Data":"cff8f6fb6eba26b265a704425cf51209bd9cddd4648c408ef31d1b41fae99327"} Apr 16 18:12:34.483558 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:34.483503 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2zvsx" podStartSLOduration=138.988932201 
podStartE2EDuration="2m20.483486613s" podCreationTimestamp="2026-04-16 18:10:14 +0000 UTC" firstStartedPulling="2026-04-16 18:12:32.070722451 +0000 UTC m=+171.725957282" lastFinishedPulling="2026-04-16 18:12:33.565276863 +0000 UTC m=+173.220511694" observedRunningTime="2026-04-16 18:12:34.482966857 +0000 UTC m=+174.138201724" watchObservedRunningTime="2026-04-16 18:12:34.483486613 +0000 UTC m=+174.138721486" Apr 16 18:12:36.474284 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:36.474244 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-xwkwz" event={"ID":"bc9d6d92-ba2b-4d9e-b44b-455ceaa353ef","Type":"ContainerStarted","Data":"dc7af2a79a24e79dddd492fcb1273ed92b05c3509dc2a0073e1e40c67297ec91"} Apr 16 18:12:36.474284 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:36.474291 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-xwkwz" event={"ID":"bc9d6d92-ba2b-4d9e-b44b-455ceaa353ef","Type":"ContainerStarted","Data":"e1850f9c431ebff4e45fa4461294512a3f3d67c56147f4fe1237c0a599d32ce4"} Apr 16 18:12:36.492966 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:36.492920 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-78f957474d-xwkwz" podStartSLOduration=2.237969069 podStartE2EDuration="3.49290488s" podCreationTimestamp="2026-04-16 18:12:33 +0000 UTC" firstStartedPulling="2026-04-16 18:12:34.239416378 +0000 UTC m=+173.894651211" lastFinishedPulling="2026-04-16 18:12:35.494352176 +0000 UTC m=+175.149587022" observedRunningTime="2026-04-16 18:12:36.492147568 +0000 UTC m=+176.147382423" watchObservedRunningTime="2026-04-16 18:12:36.49290488 +0000 UTC m=+176.148139733" Apr 16 18:12:38.435172 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:38.433400 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5b97668b7f-lm46z"] Apr 16 18:12:38.438092 ip-10-0-142-43 
kubenswrapper[2574]: I0416 18:12:38.438061 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b97668b7f-lm46z"
Apr 16 18:12:38.440713 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:38.440694 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 16 18:12:38.441877 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:38.441856 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 18:12:38.443231 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:38.443213 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 16 18:12:38.443346 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:38.443329 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 16 18:12:38.443408 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:38.443222 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 18:12:38.443408 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:38.443364 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 16 18:12:38.443664 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:38.443650 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 16 18:12:38.443765 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:38.443747 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-p8ftn\""
Apr 16 18:12:38.447945 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:38.447922 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 16 18:12:38.448713 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:38.448690 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b97668b7f-lm46z"]
Apr 16 18:12:38.544717 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:38.544680 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-console-config\") pod \"console-5b97668b7f-lm46z\" (UID: \"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90\") " pod="openshift-console/console-5b97668b7f-lm46z"
Apr 16 18:12:38.544934 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:38.544721 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-oauth-serving-cert\") pod \"console-5b97668b7f-lm46z\" (UID: \"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90\") " pod="openshift-console/console-5b97668b7f-lm46z"
Apr 16 18:12:38.544934 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:38.544754 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d84zx\" (UniqueName: \"kubernetes.io/projected/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-kube-api-access-d84zx\") pod \"console-5b97668b7f-lm46z\" (UID: \"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90\") " pod="openshift-console/console-5b97668b7f-lm46z"
Apr 16 18:12:38.544934 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:38.544814 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-console-oauth-config\") pod \"console-5b97668b7f-lm46z\" (UID: \"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90\") " pod="openshift-console/console-5b97668b7f-lm46z"
Apr 16 18:12:38.544934 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:38.544886 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-trusted-ca-bundle\") pod \"console-5b97668b7f-lm46z\" (UID: \"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90\") " pod="openshift-console/console-5b97668b7f-lm46z"
Apr 16 18:12:38.544934 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:38.544919 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-console-serving-cert\") pod \"console-5b97668b7f-lm46z\" (UID: \"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90\") " pod="openshift-console/console-5b97668b7f-lm46z"
Apr 16 18:12:38.545101 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:38.544946 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-service-ca\") pod \"console-5b97668b7f-lm46z\" (UID: \"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90\") " pod="openshift-console/console-5b97668b7f-lm46z"
Apr 16 18:12:38.645591 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:38.645554 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-console-serving-cert\") pod \"console-5b97668b7f-lm46z\" (UID: \"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90\") " pod="openshift-console/console-5b97668b7f-lm46z"
Apr 16 18:12:38.645591 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:38.645592 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-service-ca\") pod \"console-5b97668b7f-lm46z\" (UID: \"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90\") " pod="openshift-console/console-5b97668b7f-lm46z"
Apr 16 18:12:38.645846 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:38.645618 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-console-config\") pod \"console-5b97668b7f-lm46z\" (UID: \"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90\") " pod="openshift-console/console-5b97668b7f-lm46z"
Apr 16 18:12:38.645846 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:38.645643 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-oauth-serving-cert\") pod \"console-5b97668b7f-lm46z\" (UID: \"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90\") " pod="openshift-console/console-5b97668b7f-lm46z"
Apr 16 18:12:38.645846 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:38.645675 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d84zx\" (UniqueName: \"kubernetes.io/projected/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-kube-api-access-d84zx\") pod \"console-5b97668b7f-lm46z\" (UID: \"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90\") " pod="openshift-console/console-5b97668b7f-lm46z"
Apr 16 18:12:38.645846 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:38.645729 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-console-oauth-config\") pod \"console-5b97668b7f-lm46z\" (UID: \"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90\") " pod="openshift-console/console-5b97668b7f-lm46z"
Apr 16 18:12:38.645846 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:38.645769 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-trusted-ca-bundle\") pod \"console-5b97668b7f-lm46z\" (UID: \"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90\") " pod="openshift-console/console-5b97668b7f-lm46z"
Apr 16 18:12:38.646467 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:38.646437 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-service-ca\") pod \"console-5b97668b7f-lm46z\" (UID: \"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90\") " pod="openshift-console/console-5b97668b7f-lm46z"
Apr 16 18:12:38.646583 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:38.646514 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-oauth-serving-cert\") pod \"console-5b97668b7f-lm46z\" (UID: \"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90\") " pod="openshift-console/console-5b97668b7f-lm46z"
Apr 16 18:12:38.646583 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:38.646512 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-console-config\") pod \"console-5b97668b7f-lm46z\" (UID: \"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90\") " pod="openshift-console/console-5b97668b7f-lm46z"
Apr 16 18:12:38.646699 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:38.646633 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-trusted-ca-bundle\") pod \"console-5b97668b7f-lm46z\" (UID: \"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90\") " pod="openshift-console/console-5b97668b7f-lm46z"
Apr 16 18:12:38.648181 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:38.648153 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-console-serving-cert\") pod \"console-5b97668b7f-lm46z\" (UID: \"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90\") " pod="openshift-console/console-5b97668b7f-lm46z"
Apr 16 18:12:38.648296 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:38.648229 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-console-oauth-config\") pod \"console-5b97668b7f-lm46z\" (UID: \"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90\") " pod="openshift-console/console-5b97668b7f-lm46z"
Apr 16 18:12:38.655227 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:38.655205 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d84zx\" (UniqueName: \"kubernetes.io/projected/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-kube-api-access-d84zx\") pod \"console-5b97668b7f-lm46z\" (UID: \"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90\") " pod="openshift-console/console-5b97668b7f-lm46z"
Apr 16 18:12:38.749304 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:38.749202 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b97668b7f-lm46z"
Apr 16 18:12:38.890208 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:38.890177 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b97668b7f-lm46z"]
Apr 16 18:12:38.893542 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:12:38.893511 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26f3e105_8fb9_48ef_b9a3_f1c27fcc5c90.slice/crio-311b2a62a5249a95a8f45fc8e1b1b2ea6c8dd220eb2f650dcfd9724775c84d8a WatchSource:0}: Error finding container 311b2a62a5249a95a8f45fc8e1b1b2ea6c8dd220eb2f650dcfd9724775c84d8a: Status 404 returned error can't find the container with id 311b2a62a5249a95a8f45fc8e1b1b2ea6c8dd220eb2f650dcfd9724775c84d8a
Apr 16 18:12:39.484198 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:39.484160 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b97668b7f-lm46z" event={"ID":"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90","Type":"ContainerStarted","Data":"311b2a62a5249a95a8f45fc8e1b1b2ea6c8dd220eb2f650dcfd9724775c84d8a"}
Apr 16 18:12:42.494439 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:42.494398 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b97668b7f-lm46z" event={"ID":"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90","Type":"ContainerStarted","Data":"fc2f9cb536e7a6640acfe514c196a34546f117c6c3685ceb17d523adf33a72be"}
Apr 16 18:12:42.514847 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:42.514786 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5b97668b7f-lm46z" podStartSLOduration=1.748435784 podStartE2EDuration="4.514768811s" podCreationTimestamp="2026-04-16 18:12:38 +0000 UTC" firstStartedPulling="2026-04-16 18:12:38.895395088 +0000 UTC m=+178.550629920" lastFinishedPulling="2026-04-16 18:12:41.661728112 +0000 UTC m=+181.316962947" observedRunningTime="2026-04-16 18:12:42.512596723 +0000 UTC m=+182.167831575" watchObservedRunningTime="2026-04-16 18:12:42.514768811 +0000 UTC m=+182.170003663"
Apr 16 18:12:42.939941 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:42.939910 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-6r68b"]
Apr 16 18:12:42.943356 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:42.943331 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-6r68b"
Apr 16 18:12:42.945799 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:42.945770 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 18:12:42.945953 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:42.945896 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 18:12:42.945953 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:42.945923 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 18:12:42.946119 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:42.946106 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-qbj8r\""
Apr 16 18:12:43.088337 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:43.088294 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c1a22ad6-2e26-41ba-918c-624abea492fb-sys\") pod \"node-exporter-6r68b\" (UID: \"c1a22ad6-2e26-41ba-918c-624abea492fb\") " pod="openshift-monitoring/node-exporter-6r68b"
Apr 16 18:12:43.088493 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:43.088346 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c1a22ad6-2e26-41ba-918c-624abea492fb-node-exporter-textfile\") pod \"node-exporter-6r68b\" (UID: \"c1a22ad6-2e26-41ba-918c-624abea492fb\") " pod="openshift-monitoring/node-exporter-6r68b"
Apr 16 18:12:43.088493 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:43.088465 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c1a22ad6-2e26-41ba-918c-624abea492fb-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6r68b\" (UID: \"c1a22ad6-2e26-41ba-918c-624abea492fb\") " pod="openshift-monitoring/node-exporter-6r68b"
Apr 16 18:12:43.088603 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:43.088531 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c1a22ad6-2e26-41ba-918c-624abea492fb-root\") pod \"node-exporter-6r68b\" (UID: \"c1a22ad6-2e26-41ba-918c-624abea492fb\") " pod="openshift-monitoring/node-exporter-6r68b"
Apr 16 18:12:43.088603 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:43.088589 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c1a22ad6-2e26-41ba-918c-624abea492fb-node-exporter-tls\") pod \"node-exporter-6r68b\" (UID: \"c1a22ad6-2e26-41ba-918c-624abea492fb\") " pod="openshift-monitoring/node-exporter-6r68b"
Apr 16 18:12:43.088685 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:43.088614 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtjxt\" (UniqueName: \"kubernetes.io/projected/c1a22ad6-2e26-41ba-918c-624abea492fb-kube-api-access-xtjxt\") pod \"node-exporter-6r68b\" (UID: \"c1a22ad6-2e26-41ba-918c-624abea492fb\") " pod="openshift-monitoring/node-exporter-6r68b"
Apr 16 18:12:43.088685 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:43.088648 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c1a22ad6-2e26-41ba-918c-624abea492fb-node-exporter-wtmp\") pod \"node-exporter-6r68b\" (UID: \"c1a22ad6-2e26-41ba-918c-624abea492fb\") " pod="openshift-monitoring/node-exporter-6r68b"
Apr 16 18:12:43.088685 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:43.088673 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c1a22ad6-2e26-41ba-918c-624abea492fb-node-exporter-accelerators-collector-config\") pod \"node-exporter-6r68b\" (UID: \"c1a22ad6-2e26-41ba-918c-624abea492fb\") " pod="openshift-monitoring/node-exporter-6r68b"
Apr 16 18:12:43.088793 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:43.088707 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c1a22ad6-2e26-41ba-918c-624abea492fb-metrics-client-ca\") pod \"node-exporter-6r68b\" (UID: \"c1a22ad6-2e26-41ba-918c-624abea492fb\") " pod="openshift-monitoring/node-exporter-6r68b"
Apr 16 18:12:43.189892 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:43.189855 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c1a22ad6-2e26-41ba-918c-624abea492fb-node-exporter-tls\") pod \"node-exporter-6r68b\" (UID: \"c1a22ad6-2e26-41ba-918c-624abea492fb\") " pod="openshift-monitoring/node-exporter-6r68b"
Apr 16 18:12:43.189892 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:43.189894 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xtjxt\" (UniqueName: \"kubernetes.io/projected/c1a22ad6-2e26-41ba-918c-624abea492fb-kube-api-access-xtjxt\") pod \"node-exporter-6r68b\" (UID: \"c1a22ad6-2e26-41ba-918c-624abea492fb\") " pod="openshift-monitoring/node-exporter-6r68b"
Apr 16 18:12:43.190147 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:43.189914 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c1a22ad6-2e26-41ba-918c-624abea492fb-node-exporter-wtmp\") pod \"node-exporter-6r68b\" (UID: \"c1a22ad6-2e26-41ba-918c-624abea492fb\") " pod="openshift-monitoring/node-exporter-6r68b"
Apr 16 18:12:43.190147 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:12:43.190026 2574 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 16 18:12:43.190147 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:43.190049 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c1a22ad6-2e26-41ba-918c-624abea492fb-node-exporter-accelerators-collector-config\") pod \"node-exporter-6r68b\" (UID: \"c1a22ad6-2e26-41ba-918c-624abea492fb\") " pod="openshift-monitoring/node-exporter-6r68b"
Apr 16 18:12:43.190147 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:12:43.190105 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1a22ad6-2e26-41ba-918c-624abea492fb-node-exporter-tls podName:c1a22ad6-2e26-41ba-918c-624abea492fb nodeName:}" failed. No retries permitted until 2026-04-16 18:12:43.690084123 +0000 UTC m=+183.345318959 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/c1a22ad6-2e26-41ba-918c-624abea492fb-node-exporter-tls") pod "node-exporter-6r68b" (UID: "c1a22ad6-2e26-41ba-918c-624abea492fb") : secret "node-exporter-tls" not found
Apr 16 18:12:43.190147 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:43.190145 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c1a22ad6-2e26-41ba-918c-624abea492fb-metrics-client-ca\") pod \"node-exporter-6r68b\" (UID: \"c1a22ad6-2e26-41ba-918c-624abea492fb\") " pod="openshift-monitoring/node-exporter-6r68b"
Apr 16 18:12:43.190438 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:43.190194 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c1a22ad6-2e26-41ba-918c-624abea492fb-sys\") pod \"node-exporter-6r68b\" (UID: \"c1a22ad6-2e26-41ba-918c-624abea492fb\") " pod="openshift-monitoring/node-exporter-6r68b"
Apr 16 18:12:43.190438 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:43.190219 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c1a22ad6-2e26-41ba-918c-624abea492fb-node-exporter-textfile\") pod \"node-exporter-6r68b\" (UID: \"c1a22ad6-2e26-41ba-918c-624abea492fb\") " pod="openshift-monitoring/node-exporter-6r68b"
Apr 16 18:12:43.190438 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:43.190253 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c1a22ad6-2e26-41ba-918c-624abea492fb-sys\") pod \"node-exporter-6r68b\" (UID: \"c1a22ad6-2e26-41ba-918c-624abea492fb\") " pod="openshift-monitoring/node-exporter-6r68b"
Apr 16 18:12:43.190438 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:43.190303 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c1a22ad6-2e26-41ba-918c-624abea492fb-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6r68b\" (UID: \"c1a22ad6-2e26-41ba-918c-624abea492fb\") " pod="openshift-monitoring/node-exporter-6r68b"
Apr 16 18:12:43.190438 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:43.190219 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c1a22ad6-2e26-41ba-918c-624abea492fb-node-exporter-wtmp\") pod \"node-exporter-6r68b\" (UID: \"c1a22ad6-2e26-41ba-918c-624abea492fb\") " pod="openshift-monitoring/node-exporter-6r68b"
Apr 16 18:12:43.190678 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:43.190472 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c1a22ad6-2e26-41ba-918c-624abea492fb-root\") pod \"node-exporter-6r68b\" (UID: \"c1a22ad6-2e26-41ba-918c-624abea492fb\") " pod="openshift-monitoring/node-exporter-6r68b"
Apr 16 18:12:43.190678 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:43.190552 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c1a22ad6-2e26-41ba-918c-624abea492fb-node-exporter-textfile\") pod \"node-exporter-6r68b\" (UID: \"c1a22ad6-2e26-41ba-918c-624abea492fb\") " pod="openshift-monitoring/node-exporter-6r68b"
Apr 16 18:12:43.190678 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:43.190575 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c1a22ad6-2e26-41ba-918c-624abea492fb-root\") pod \"node-exporter-6r68b\" (UID: \"c1a22ad6-2e26-41ba-918c-624abea492fb\") " pod="openshift-monitoring/node-exporter-6r68b"
Apr 16 18:12:43.190790 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:43.190716 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c1a22ad6-2e26-41ba-918c-624abea492fb-metrics-client-ca\") pod \"node-exporter-6r68b\" (UID: \"c1a22ad6-2e26-41ba-918c-624abea492fb\") " pod="openshift-monitoring/node-exporter-6r68b"
Apr 16 18:12:43.190790 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:43.190722 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c1a22ad6-2e26-41ba-918c-624abea492fb-node-exporter-accelerators-collector-config\") pod \"node-exporter-6r68b\" (UID: \"c1a22ad6-2e26-41ba-918c-624abea492fb\") " pod="openshift-monitoring/node-exporter-6r68b"
Apr 16 18:12:43.192842 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:43.192805 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c1a22ad6-2e26-41ba-918c-624abea492fb-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6r68b\" (UID: \"c1a22ad6-2e26-41ba-918c-624abea492fb\") " pod="openshift-monitoring/node-exporter-6r68b"
Apr 16 18:12:43.202718 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:43.202688 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtjxt\" (UniqueName: \"kubernetes.io/projected/c1a22ad6-2e26-41ba-918c-624abea492fb-kube-api-access-xtjxt\") pod \"node-exporter-6r68b\" (UID: \"c1a22ad6-2e26-41ba-918c-624abea492fb\") " pod="openshift-monitoring/node-exporter-6r68b"
Apr 16 18:12:43.695973 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:43.695936 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c1a22ad6-2e26-41ba-918c-624abea492fb-node-exporter-tls\") pod \"node-exporter-6r68b\" (UID: \"c1a22ad6-2e26-41ba-918c-624abea492fb\") " pod="openshift-monitoring/node-exporter-6r68b"
Apr 16 18:12:43.698328 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:43.698307 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c1a22ad6-2e26-41ba-918c-624abea492fb-node-exporter-tls\") pod \"node-exporter-6r68b\" (UID: \"c1a22ad6-2e26-41ba-918c-624abea492fb\") " pod="openshift-monitoring/node-exporter-6r68b"
Apr 16 18:12:43.853260 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:43.853226 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-6r68b"
Apr 16 18:12:43.862439 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:12:43.862399 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1a22ad6_2e26_41ba_918c_624abea492fb.slice/crio-1ee9be3d2348a65e7d5ee10e91346a990bc17f3a8266f963b910a5c00f105c82 WatchSource:0}: Error finding container 1ee9be3d2348a65e7d5ee10e91346a990bc17f3a8266f963b910a5c00f105c82: Status 404 returned error can't find the container with id 1ee9be3d2348a65e7d5ee10e91346a990bc17f3a8266f963b910a5c00f105c82
Apr 16 18:12:44.501366 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:44.501331 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6r68b" event={"ID":"c1a22ad6-2e26-41ba-918c-624abea492fb","Type":"ContainerStarted","Data":"1ee9be3d2348a65e7d5ee10e91346a990bc17f3a8266f963b910a5c00f105c82"}
Apr 16 18:12:45.505888 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:45.505847 2574 generic.go:358] "Generic (PLEG): container finished" podID="c1a22ad6-2e26-41ba-918c-624abea492fb" containerID="6988c39ab5f88c159d92d0bbde20cbe86f088c52d6dbd88ad41381648fdc2cd9" exitCode=0
Apr 16 18:12:45.506254 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:45.505895 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6r68b" event={"ID":"c1a22ad6-2e26-41ba-918c-624abea492fb","Type":"ContainerDied","Data":"6988c39ab5f88c159d92d0bbde20cbe86f088c52d6dbd88ad41381648fdc2cd9"}
Apr 16 18:12:46.510605 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:46.510569 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6r68b" event={"ID":"c1a22ad6-2e26-41ba-918c-624abea492fb","Type":"ContainerStarted","Data":"23f28ec4db0546eb05379b7c4497ecd3f537bbd6261428a1ff4c5fa4b0481b79"}
Apr 16 18:12:46.510605 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:46.510605 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6r68b" event={"ID":"c1a22ad6-2e26-41ba-918c-624abea492fb","Type":"ContainerStarted","Data":"9273e2df7a87b3a426aa4533204ba4c6b63437ae2d9349dac5d5eb370474f4db"}
Apr 16 18:12:46.535992 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:46.535938 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-6r68b" podStartSLOduration=3.560545391 podStartE2EDuration="4.535923801s" podCreationTimestamp="2026-04-16 18:12:42 +0000 UTC" firstStartedPulling="2026-04-16 18:12:43.864621474 +0000 UTC m=+183.519856309" lastFinishedPulling="2026-04-16 18:12:44.839999887 +0000 UTC m=+184.495234719" observedRunningTime="2026-04-16 18:12:46.534197615 +0000 UTC m=+186.189432468" watchObservedRunningTime="2026-04-16 18:12:46.535923801 +0000 UTC m=+186.191158653"
Apr 16 18:12:48.253484 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:48.253447 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5b97668b7f-lm46z"]
Apr 16 18:12:48.449037 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:48.449004 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-545cbcd4f-txvcv"
Apr 16 18:12:48.750011 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:12:48.749969 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5b97668b7f-lm46z"
Apr 16 18:13:09.570620 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:09.570584 2574 generic.go:358] "Generic (PLEG): container finished" podID="e47f34ff-4372-4215-ba86-42576df70e3d" containerID="1a8ada349354f4a1a58d9952cf2d9dcea4447326e68651e807ea22c51fa32199" exitCode=0
Apr 16 18:13:09.571115 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:09.570657 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-r45bl" event={"ID":"e47f34ff-4372-4215-ba86-42576df70e3d","Type":"ContainerDied","Data":"1a8ada349354f4a1a58d9952cf2d9dcea4447326e68651e807ea22c51fa32199"}
Apr 16 18:13:09.571115 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:09.571002 2574 scope.go:117] "RemoveContainer" containerID="1a8ada349354f4a1a58d9952cf2d9dcea4447326e68651e807ea22c51fa32199"
Apr 16 18:13:10.574993 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:10.574956 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-r45bl" event={"ID":"e47f34ff-4372-4215-ba86-42576df70e3d","Type":"ContainerStarted","Data":"a968f1619d40629785d9cf75c3c51a0b89610ebdca4fd0937069930eeae651e5"}
Apr 16 18:13:13.272886 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:13.272809 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5b97668b7f-lm46z" podUID="26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90" containerName="console" containerID="cri-o://fc2f9cb536e7a6640acfe514c196a34546f117c6c3685ceb17d523adf33a72be" gracePeriod=15
Apr 16 18:13:13.511765 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:13.511742 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5b97668b7f-lm46z_26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90/console/0.log"
Apr 16 18:13:13.511905 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:13.511805 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b97668b7f-lm46z"
Apr 16 18:13:13.545677 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:13.545590 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-console-serving-cert\") pod \"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90\" (UID: \"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90\") "
Apr 16 18:13:13.545677 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:13.545629 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-trusted-ca-bundle\") pod \"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90\" (UID: \"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90\") "
Apr 16 18:13:13.545677 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:13.545651 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d84zx\" (UniqueName: \"kubernetes.io/projected/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-kube-api-access-d84zx\") pod \"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90\" (UID: \"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90\") "
Apr 16 18:13:13.545677 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:13.545678 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-console-oauth-config\") pod \"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90\" (UID: \"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90\") "
Apr 16 18:13:13.546039 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:13.545704 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-service-ca\") pod \"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90\" (UID: \"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90\") "
Apr 16 18:13:13.546039 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:13.545765 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-oauth-serving-cert\") pod \"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90\" (UID: \"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90\") "
Apr 16 18:13:13.546039 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:13.545784 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-console-config\") pod \"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90\" (UID: \"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90\") "
Apr 16 18:13:13.546200 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:13.546073 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90" (UID: "26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:13:13.546200 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:13.546177 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90" (UID: "26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:13:13.546695 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:13.546664 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-service-ca" (OuterVolumeSpecName: "service-ca") pod "26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90" (UID: "26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:13:13.546799 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:13.546706 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-console-config" (OuterVolumeSpecName: "console-config") pod "26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90" (UID: "26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:13:13.548135 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:13.548109 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90" (UID: "26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:13:13.548243 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:13.548223 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-kube-api-access-d84zx" (OuterVolumeSpecName: "kube-api-access-d84zx") pod "26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90" (UID: "26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90"). InnerVolumeSpecName "kube-api-access-d84zx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:13:13.548243 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:13.548233 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90" (UID: "26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:13:13.584726 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:13.584696 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5b97668b7f-lm46z_26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90/console/0.log"
Apr 16 18:13:13.584935 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:13.584735 2574 generic.go:358] "Generic (PLEG): container finished" podID="26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90" containerID="fc2f9cb536e7a6640acfe514c196a34546f117c6c3685ceb17d523adf33a72be" exitCode=2
Apr 16 18:13:13.584935 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:13.584769 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b97668b7f-lm46z" event={"ID":"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90","Type":"ContainerDied","Data":"fc2f9cb536e7a6640acfe514c196a34546f117c6c3685ceb17d523adf33a72be"}
Apr 16 18:13:13.584935 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:13.584800 2574 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-console/console-5b97668b7f-lm46z" Apr 16 18:13:13.584935 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:13.584808 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b97668b7f-lm46z" event={"ID":"26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90","Type":"ContainerDied","Data":"311b2a62a5249a95a8f45fc8e1b1b2ea6c8dd220eb2f650dcfd9724775c84d8a"} Apr 16 18:13:13.584935 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:13.584849 2574 scope.go:117] "RemoveContainer" containerID="fc2f9cb536e7a6640acfe514c196a34546f117c6c3685ceb17d523adf33a72be" Apr 16 18:13:13.593428 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:13.593399 2574 scope.go:117] "RemoveContainer" containerID="fc2f9cb536e7a6640acfe514c196a34546f117c6c3685ceb17d523adf33a72be" Apr 16 18:13:13.593719 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:13:13.593698 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc2f9cb536e7a6640acfe514c196a34546f117c6c3685ceb17d523adf33a72be\": container with ID starting with fc2f9cb536e7a6640acfe514c196a34546f117c6c3685ceb17d523adf33a72be not found: ID does not exist" containerID="fc2f9cb536e7a6640acfe514c196a34546f117c6c3685ceb17d523adf33a72be" Apr 16 18:13:13.593799 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:13.593732 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc2f9cb536e7a6640acfe514c196a34546f117c6c3685ceb17d523adf33a72be"} err="failed to get container status \"fc2f9cb536e7a6640acfe514c196a34546f117c6c3685ceb17d523adf33a72be\": rpc error: code = NotFound desc = could not find container \"fc2f9cb536e7a6640acfe514c196a34546f117c6c3685ceb17d523adf33a72be\": container with ID starting with fc2f9cb536e7a6640acfe514c196a34546f117c6c3685ceb17d523adf33a72be not found: ID does not exist" Apr 16 18:13:13.606715 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:13.606686 2574 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["openshift-console/console-5b97668b7f-lm46z"] Apr 16 18:13:13.611105 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:13.611076 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5b97668b7f-lm46z"] Apr 16 18:13:13.647264 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:13.647227 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-console-serving-cert\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:13:13.647264 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:13.647258 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-trusted-ca-bundle\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:13:13.647430 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:13.647270 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d84zx\" (UniqueName: \"kubernetes.io/projected/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-kube-api-access-d84zx\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:13:13.647430 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:13.647293 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-console-oauth-config\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:13:13.647430 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:13.647303 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-service-ca\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:13:13.647430 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:13.647313 2574 reconciler_common.go:299] "Volume detached for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-oauth-serving-cert\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:13:13.647430 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:13.647322 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90-console-config\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:13:14.937743 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:14.937711 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90" path="/var/lib/kubelet/pods/26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90/volumes" Apr 16 18:13:52.770274 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:52.770173 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/182ef3ca-8527-40a2-b1a7-c714bd3509c5-metrics-certs\") pod \"network-metrics-daemon-kgtvr\" (UID: \"182ef3ca-8527-40a2-b1a7-c714bd3509c5\") " pod="openshift-multus/network-metrics-daemon-kgtvr" Apr 16 18:13:52.772653 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:52.772619 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/182ef3ca-8527-40a2-b1a7-c714bd3509c5-metrics-certs\") pod \"network-metrics-daemon-kgtvr\" (UID: \"182ef3ca-8527-40a2-b1a7-c714bd3509c5\") " pod="openshift-multus/network-metrics-daemon-kgtvr" Apr 16 18:13:52.838729 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:52.838696 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kjmpp\"" Apr 16 18:13:52.846332 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:52.846282 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgtvr" Apr 16 18:13:52.974882 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:52.974847 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kgtvr"] Apr 16 18:13:52.977891 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:13:52.977855 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod182ef3ca_8527_40a2_b1a7_c714bd3509c5.slice/crio-0783e5dec10f0d570dad1ceec589a76a1cfd809b579d6e0066cecade1075fd5a WatchSource:0}: Error finding container 0783e5dec10f0d570dad1ceec589a76a1cfd809b579d6e0066cecade1075fd5a: Status 404 returned error can't find the container with id 0783e5dec10f0d570dad1ceec589a76a1cfd809b579d6e0066cecade1075fd5a Apr 16 18:13:53.700888 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:53.700850 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kgtvr" event={"ID":"182ef3ca-8527-40a2-b1a7-c714bd3509c5","Type":"ContainerStarted","Data":"0783e5dec10f0d570dad1ceec589a76a1cfd809b579d6e0066cecade1075fd5a"} Apr 16 18:13:54.705084 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:54.705038 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kgtvr" event={"ID":"182ef3ca-8527-40a2-b1a7-c714bd3509c5","Type":"ContainerStarted","Data":"028fcd9e8612afe8df460638ab4dd5eeef21a47ee4b65ffb27d2322b572d1a8f"} Apr 16 18:13:54.705084 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:54.705074 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kgtvr" event={"ID":"182ef3ca-8527-40a2-b1a7-c714bd3509c5","Type":"ContainerStarted","Data":"381ea74334c5fb73000df4f2347ec388870973fb99d550b948d41c736c51c256"} Apr 16 18:13:54.722365 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:13:54.722298 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-kgtvr" podStartSLOduration=252.798992593 podStartE2EDuration="4m13.722278004s" podCreationTimestamp="2026-04-16 18:09:41 +0000 UTC" firstStartedPulling="2026-04-16 18:13:52.97978068 +0000 UTC m=+252.635015516" lastFinishedPulling="2026-04-16 18:13:53.903066092 +0000 UTC m=+253.558300927" observedRunningTime="2026-04-16 18:13:54.721362193 +0000 UTC m=+254.376597045" watchObservedRunningTime="2026-04-16 18:13:54.722278004 +0000 UTC m=+254.377512856" Apr 16 18:14:40.819744 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:14:40.819710 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4lsvd_1be4b879-19c7-4497-badb-3f90683cdd48/console-operator/1.log" Apr 16 18:14:40.820941 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:14:40.820915 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4lsvd_1be4b879-19c7-4497-badb-3f90683cdd48/console-operator/1.log" Apr 16 18:14:40.825540 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:14:40.825507 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wt28v_d40f8597-4c6c-46ba-9f26-4cea171429a6/ovn-acl-logging/0.log" Apr 16 18:14:40.826921 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:14:40.826890 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wt28v_d40f8597-4c6c-46ba-9f26-4cea171429a6/ovn-acl-logging/0.log" Apr 16 18:14:40.829265 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:14:40.829247 2574 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 18:17:44.701970 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:17:44.701932 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-pzjx2"] Apr 16 18:17:44.702451 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:17:44.702214 2574 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90" containerName="console" Apr 16 18:17:44.702451 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:17:44.702226 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90" containerName="console" Apr 16 18:17:44.702451 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:17:44.702282 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="26f3e105-8fb9-48ef-b9a3-f1c27fcc5c90" containerName="console" Apr 16 18:17:44.704949 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:17:44.704931 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-pzjx2" Apr 16 18:17:44.707505 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:17:44.707478 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 18:17:44.707654 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:17:44.707490 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 18:17:44.707654 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:17:44.707598 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-qqcls\"" Apr 16 18:17:44.707654 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:17:44.707606 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 18:17:44.714655 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:17:44.714627 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-pzjx2"] Apr 16 18:17:44.818043 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:17:44.817979 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-gj7zt\" (UniqueName: \"kubernetes.io/projected/98cea52f-dcae-42bd-840a-c9e96a4f6601-kube-api-access-gj7zt\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-pzjx2\" (UID: \"98cea52f-dcae-42bd-840a-c9e96a4f6601\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-pzjx2" Apr 16 18:17:44.818231 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:17:44.818074 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/98cea52f-dcae-42bd-840a-c9e96a4f6601-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-pzjx2\" (UID: \"98cea52f-dcae-42bd-840a-c9e96a4f6601\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-pzjx2" Apr 16 18:17:44.919296 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:17:44.919240 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/98cea52f-dcae-42bd-840a-c9e96a4f6601-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-pzjx2\" (UID: \"98cea52f-dcae-42bd-840a-c9e96a4f6601\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-pzjx2" Apr 16 18:17:44.919447 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:17:44.919349 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gj7zt\" (UniqueName: \"kubernetes.io/projected/98cea52f-dcae-42bd-840a-c9e96a4f6601-kube-api-access-gj7zt\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-pzjx2\" (UID: \"98cea52f-dcae-42bd-840a-c9e96a4f6601\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-pzjx2" Apr 16 18:17:44.921711 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:17:44.921690 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/98cea52f-dcae-42bd-840a-c9e96a4f6601-certificates\") pod 
\"custom-metrics-autoscaler-operator-bbf89fd5d-pzjx2\" (UID: \"98cea52f-dcae-42bd-840a-c9e96a4f6601\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-pzjx2" Apr 16 18:17:44.929912 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:17:44.929890 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj7zt\" (UniqueName: \"kubernetes.io/projected/98cea52f-dcae-42bd-840a-c9e96a4f6601-kube-api-access-gj7zt\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-pzjx2\" (UID: \"98cea52f-dcae-42bd-840a-c9e96a4f6601\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-pzjx2" Apr 16 18:17:45.016062 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:17:45.015970 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-pzjx2" Apr 16 18:17:45.147215 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:17:45.147147 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-pzjx2"] Apr 16 18:17:45.150214 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:17:45.150184 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98cea52f_dcae_42bd_840a_c9e96a4f6601.slice/crio-7b1048a70c9c2b43f6073cb3c4790934b2ddc7a4f73e29ec732e3db40a80ebfa WatchSource:0}: Error finding container 7b1048a70c9c2b43f6073cb3c4790934b2ddc7a4f73e29ec732e3db40a80ebfa: Status 404 returned error can't find the container with id 7b1048a70c9c2b43f6073cb3c4790934b2ddc7a4f73e29ec732e3db40a80ebfa Apr 16 18:17:45.152014 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:17:45.151994 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:17:45.337476 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:17:45.337386 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-pzjx2" event={"ID":"98cea52f-dcae-42bd-840a-c9e96a4f6601","Type":"ContainerStarted","Data":"7b1048a70c9c2b43f6073cb3c4790934b2ddc7a4f73e29ec732e3db40a80ebfa"} Apr 16 18:17:49.350792 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:17:49.350756 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-pzjx2" event={"ID":"98cea52f-dcae-42bd-840a-c9e96a4f6601","Type":"ContainerStarted","Data":"79518efe324d594ffc0d1d2bf2b68d34cc8d0ca522c39068e93bb86304f890a6"} Apr 16 18:17:49.351214 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:17:49.350807 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-pzjx2" Apr 16 18:17:49.380509 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:17:49.380457 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-pzjx2" podStartSLOduration=2.227195847 podStartE2EDuration="5.380440729s" podCreationTimestamp="2026-04-16 18:17:44 +0000 UTC" firstStartedPulling="2026-04-16 18:17:45.152173463 +0000 UTC m=+484.807408294" lastFinishedPulling="2026-04-16 18:17:48.305418342 +0000 UTC m=+487.960653176" observedRunningTime="2026-04-16 18:17:49.380011053 +0000 UTC m=+489.035245935" watchObservedRunningTime="2026-04-16 18:17:49.380440729 +0000 UTC m=+489.035675643" Apr 16 18:18:10.356456 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:18:10.356426 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-pzjx2" Apr 16 18:18:58.003535 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:18:58.003499 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-gdz27"] Apr 16 18:18:58.005392 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:18:58.005375 2574 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-gdz27" Apr 16 18:18:58.007924 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:18:58.007898 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-s7jwt\"" Apr 16 18:18:58.007924 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:18:58.007917 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 18:18:58.008101 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:18:58.007969 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 18:18:58.014519 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:18:58.014492 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-gdz27"] Apr 16 18:18:58.083931 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:18:58.083885 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zntdf\" (UniqueName: \"kubernetes.io/projected/88ec539e-745c-4c6a-b1a7-6cd0dbfb46b7-kube-api-access-zntdf\") pod \"cert-manager-webhook-597b96b99b-gdz27\" (UID: \"88ec539e-745c-4c6a-b1a7-6cd0dbfb46b7\") " pod="cert-manager/cert-manager-webhook-597b96b99b-gdz27" Apr 16 18:18:58.084117 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:18:58.084026 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/88ec539e-745c-4c6a-b1a7-6cd0dbfb46b7-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-gdz27\" (UID: \"88ec539e-745c-4c6a-b1a7-6cd0dbfb46b7\") " pod="cert-manager/cert-manager-webhook-597b96b99b-gdz27" Apr 16 18:18:58.185001 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:18:58.184963 2574 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-api-access-zntdf\" (UniqueName: \"kubernetes.io/projected/88ec539e-745c-4c6a-b1a7-6cd0dbfb46b7-kube-api-access-zntdf\") pod \"cert-manager-webhook-597b96b99b-gdz27\" (UID: \"88ec539e-745c-4c6a-b1a7-6cd0dbfb46b7\") " pod="cert-manager/cert-manager-webhook-597b96b99b-gdz27" Apr 16 18:18:58.185180 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:18:58.185034 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/88ec539e-745c-4c6a-b1a7-6cd0dbfb46b7-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-gdz27\" (UID: \"88ec539e-745c-4c6a-b1a7-6cd0dbfb46b7\") " pod="cert-manager/cert-manager-webhook-597b96b99b-gdz27" Apr 16 18:18:58.196045 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:18:58.196011 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/88ec539e-745c-4c6a-b1a7-6cd0dbfb46b7-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-gdz27\" (UID: \"88ec539e-745c-4c6a-b1a7-6cd0dbfb46b7\") " pod="cert-manager/cert-manager-webhook-597b96b99b-gdz27" Apr 16 18:18:58.196174 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:18:58.196022 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zntdf\" (UniqueName: \"kubernetes.io/projected/88ec539e-745c-4c6a-b1a7-6cd0dbfb46b7-kube-api-access-zntdf\") pod \"cert-manager-webhook-597b96b99b-gdz27\" (UID: \"88ec539e-745c-4c6a-b1a7-6cd0dbfb46b7\") " pod="cert-manager/cert-manager-webhook-597b96b99b-gdz27" Apr 16 18:18:58.315357 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:18:58.315240 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-gdz27" Apr 16 18:18:58.438111 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:18:58.438086 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-gdz27"] Apr 16 18:18:58.440989 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:18:58.440966 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88ec539e_745c_4c6a_b1a7_6cd0dbfb46b7.slice/crio-90729a6fa90652482f96e83c21f29ed8aaba48d4c90386a8b440d159f54ccf63 WatchSource:0}: Error finding container 90729a6fa90652482f96e83c21f29ed8aaba48d4c90386a8b440d159f54ccf63: Status 404 returned error can't find the container with id 90729a6fa90652482f96e83c21f29ed8aaba48d4c90386a8b440d159f54ccf63 Apr 16 18:18:58.544850 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:18:58.544784 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-gdz27" event={"ID":"88ec539e-745c-4c6a-b1a7-6cd0dbfb46b7","Type":"ContainerStarted","Data":"90729a6fa90652482f96e83c21f29ed8aaba48d4c90386a8b440d159f54ccf63"} Apr 16 18:19:00.765317 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:00.765279 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-q4tz9"] Apr 16 18:19:00.767924 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:00.767896 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-q4tz9" Apr 16 18:19:00.770484 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:00.770452 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-q976b\"" Apr 16 18:19:00.778225 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:00.778194 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-q4tz9"] Apr 16 18:19:00.910241 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:00.910193 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/41d6deef-c100-40b9-b036-86b8d53a56c3-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-q4tz9\" (UID: \"41d6deef-c100-40b9-b036-86b8d53a56c3\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-q4tz9" Apr 16 18:19:00.910449 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:00.910331 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7rn7\" (UniqueName: \"kubernetes.io/projected/41d6deef-c100-40b9-b036-86b8d53a56c3-kube-api-access-t7rn7\") pod \"cert-manager-cainjector-8966b78d4-q4tz9\" (UID: \"41d6deef-c100-40b9-b036-86b8d53a56c3\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-q4tz9" Apr 16 18:19:01.011647 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:01.011600 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t7rn7\" (UniqueName: \"kubernetes.io/projected/41d6deef-c100-40b9-b036-86b8d53a56c3-kube-api-access-t7rn7\") pod \"cert-manager-cainjector-8966b78d4-q4tz9\" (UID: \"41d6deef-c100-40b9-b036-86b8d53a56c3\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-q4tz9" Apr 16 18:19:01.011870 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:01.011690 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/41d6deef-c100-40b9-b036-86b8d53a56c3-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-q4tz9\" (UID: \"41d6deef-c100-40b9-b036-86b8d53a56c3\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-q4tz9" Apr 16 18:19:01.021542 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:01.021456 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/41d6deef-c100-40b9-b036-86b8d53a56c3-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-q4tz9\" (UID: \"41d6deef-c100-40b9-b036-86b8d53a56c3\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-q4tz9" Apr 16 18:19:01.021542 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:01.021468 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7rn7\" (UniqueName: \"kubernetes.io/projected/41d6deef-c100-40b9-b036-86b8d53a56c3-kube-api-access-t7rn7\") pod \"cert-manager-cainjector-8966b78d4-q4tz9\" (UID: \"41d6deef-c100-40b9-b036-86b8d53a56c3\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-q4tz9" Apr 16 18:19:01.080427 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:01.080389 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-q4tz9"
Apr 16 18:19:01.224045 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:01.224009 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-q4tz9"]
Apr 16 18:19:01.226463 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:19:01.226423 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41d6deef_c100_40b9_b036_86b8d53a56c3.slice/crio-55b6a186c62e2c9d7acc168579ae8076a9880c78bb71f8639c0c9f0c52c15ead WatchSource:0}: Error finding container 55b6a186c62e2c9d7acc168579ae8076a9880c78bb71f8639c0c9f0c52c15ead: Status 404 returned error can't find the container with id 55b6a186c62e2c9d7acc168579ae8076a9880c78bb71f8639c0c9f0c52c15ead
Apr 16 18:19:01.554484 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:01.554428 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-q4tz9" event={"ID":"41d6deef-c100-40b9-b036-86b8d53a56c3","Type":"ContainerStarted","Data":"55b6a186c62e2c9d7acc168579ae8076a9880c78bb71f8639c0c9f0c52c15ead"}
Apr 16 18:19:03.564249 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:03.564215 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-gdz27" event={"ID":"88ec539e-745c-4c6a-b1a7-6cd0dbfb46b7","Type":"ContainerStarted","Data":"087e49d71f28df1781721caee686e1ad4681b18fb37476843aa41fda260ec2da"}
Apr 16 18:19:03.564674 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:03.564275 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-gdz27"
Apr 16 18:19:03.565650 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:03.565627 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-q4tz9" event={"ID":"41d6deef-c100-40b9-b036-86b8d53a56c3","Type":"ContainerStarted","Data":"de766a1bda61481d5bbc80c99c3fb49030075e77d3478598f0297d260482a582"}
Apr 16 18:19:03.582327 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:03.582272 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-gdz27" podStartSLOduration=2.5296741320000002 podStartE2EDuration="6.582247083s" podCreationTimestamp="2026-04-16 18:18:57 +0000 UTC" firstStartedPulling="2026-04-16 18:18:58.443431497 +0000 UTC m=+558.098666328" lastFinishedPulling="2026-04-16 18:19:02.496004433 +0000 UTC m=+562.151239279" observedRunningTime="2026-04-16 18:19:03.58172156 +0000 UTC m=+563.236956411" watchObservedRunningTime="2026-04-16 18:19:03.582247083 +0000 UTC m=+563.237481934"
Apr 16 18:19:03.599919 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:03.599858 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-q4tz9" podStartSLOduration=2.332334202 podStartE2EDuration="3.599810111s" podCreationTimestamp="2026-04-16 18:19:00 +0000 UTC" firstStartedPulling="2026-04-16 18:19:01.228913192 +0000 UTC m=+560.884148023" lastFinishedPulling="2026-04-16 18:19:02.496389101 +0000 UTC m=+562.151623932" observedRunningTime="2026-04-16 18:19:03.599580638 +0000 UTC m=+563.254815491" watchObservedRunningTime="2026-04-16 18:19:03.599810111 +0000 UTC m=+563.255044967"
Apr 16 18:19:09.570402 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:09.570368 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-gdz27"
Apr 16 18:19:15.876371 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:15.876333 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-dbthm"]
Apr 16 18:19:15.880394 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:15.880377 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-dbthm"
Apr 16 18:19:15.882929 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:15.882904 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-xx7v6\""
Apr 16 18:19:15.889440 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:15.889411 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-dbthm"]
Apr 16 18:19:16.028183 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:16.028147 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aa89fba2-1e51-4dc3-8ed2-a9e8ece01c85-bound-sa-token\") pod \"cert-manager-759f64656b-dbthm\" (UID: \"aa89fba2-1e51-4dc3-8ed2-a9e8ece01c85\") " pod="cert-manager/cert-manager-759f64656b-dbthm"
Apr 16 18:19:16.028370 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:16.028204 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s7r9\" (UniqueName: \"kubernetes.io/projected/aa89fba2-1e51-4dc3-8ed2-a9e8ece01c85-kube-api-access-2s7r9\") pod \"cert-manager-759f64656b-dbthm\" (UID: \"aa89fba2-1e51-4dc3-8ed2-a9e8ece01c85\") " pod="cert-manager/cert-manager-759f64656b-dbthm"
Apr 16 18:19:16.129577 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:16.129489 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aa89fba2-1e51-4dc3-8ed2-a9e8ece01c85-bound-sa-token\") pod \"cert-manager-759f64656b-dbthm\" (UID: \"aa89fba2-1e51-4dc3-8ed2-a9e8ece01c85\") " pod="cert-manager/cert-manager-759f64656b-dbthm"
Apr 16 18:19:16.129577 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:16.129529 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2s7r9\" (UniqueName: \"kubernetes.io/projected/aa89fba2-1e51-4dc3-8ed2-a9e8ece01c85-kube-api-access-2s7r9\") pod \"cert-manager-759f64656b-dbthm\" (UID: \"aa89fba2-1e51-4dc3-8ed2-a9e8ece01c85\") " pod="cert-manager/cert-manager-759f64656b-dbthm"
Apr 16 18:19:16.146546 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:16.146517 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s7r9\" (UniqueName: \"kubernetes.io/projected/aa89fba2-1e51-4dc3-8ed2-a9e8ece01c85-kube-api-access-2s7r9\") pod \"cert-manager-759f64656b-dbthm\" (UID: \"aa89fba2-1e51-4dc3-8ed2-a9e8ece01c85\") " pod="cert-manager/cert-manager-759f64656b-dbthm"
Apr 16 18:19:16.147655 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:16.147636 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aa89fba2-1e51-4dc3-8ed2-a9e8ece01c85-bound-sa-token\") pod \"cert-manager-759f64656b-dbthm\" (UID: \"aa89fba2-1e51-4dc3-8ed2-a9e8ece01c85\") " pod="cert-manager/cert-manager-759f64656b-dbthm"
Apr 16 18:19:16.189802 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:16.189767 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-dbthm"
Apr 16 18:19:16.312207 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:16.312167 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-dbthm"]
Apr 16 18:19:16.315934 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:19:16.315904 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa89fba2_1e51_4dc3_8ed2_a9e8ece01c85.slice/crio-a60367e5c5513dc00b7e4f5afb51b7398c2bf1bd8c0f7ef4c8105eea0185a2d9 WatchSource:0}: Error finding container a60367e5c5513dc00b7e4f5afb51b7398c2bf1bd8c0f7ef4c8105eea0185a2d9: Status 404 returned error can't find the container with id a60367e5c5513dc00b7e4f5afb51b7398c2bf1bd8c0f7ef4c8105eea0185a2d9
Apr 16 18:19:16.605068 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:16.605028 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-dbthm" event={"ID":"aa89fba2-1e51-4dc3-8ed2-a9e8ece01c85","Type":"ContainerStarted","Data":"adbdae929a3e2ae622579bea28772fa1eda3c4bae97433d8447e58af022e6741"}
Apr 16 18:19:16.605068 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:16.605068 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-dbthm" event={"ID":"aa89fba2-1e51-4dc3-8ed2-a9e8ece01c85","Type":"ContainerStarted","Data":"a60367e5c5513dc00b7e4f5afb51b7398c2bf1bd8c0f7ef4c8105eea0185a2d9"}
Apr 16 18:19:16.623672 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:16.623618 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-dbthm" podStartSLOduration=1.623602223 podStartE2EDuration="1.623602223s" podCreationTimestamp="2026-04-16 18:19:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:19:16.621444403 +0000 UTC m=+576.276679273" watchObservedRunningTime="2026-04-16 18:19:16.623602223 +0000 UTC m=+576.278837076"
Apr 16 18:19:40.841600 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:40.841573 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4lsvd_1be4b879-19c7-4497-badb-3f90683cdd48/console-operator/1.log"
Apr 16 18:19:40.842099 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:40.841602 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4lsvd_1be4b879-19c7-4497-badb-3f90683cdd48/console-operator/1.log"
Apr 16 18:19:40.848622 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:40.848596 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wt28v_d40f8597-4c6c-46ba-9f26-4cea171429a6/ovn-acl-logging/0.log"
Apr 16 18:19:40.848796 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:40.848735 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wt28v_d40f8597-4c6c-46ba-9f26-4cea171429a6/ovn-acl-logging/0.log"
Apr 16 18:19:45.685289 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:45.685251 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-65bdb464b4-w7ldw"]
Apr 16 18:19:45.688541 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:45.688518 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-w7ldw"
Apr 16 18:19:45.692184 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:45.692157 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:19:45.692344 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:45.692203 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-w9fgb\""
Apr 16 18:19:45.692344 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:45.692229 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 16 18:19:45.692344 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:45.692289 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 16 18:19:45.692526 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:45.692484 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 16 18:19:45.692711 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:45.692693 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 16 18:19:45.703842 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:45.703799 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-65bdb464b4-w7ldw"]
Apr 16 18:19:45.764665 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:45.764633 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6bcf994e-a332-4d30-b965-6c5ddd6d6fa6-cert\") pod \"lws-controller-manager-65bdb464b4-w7ldw\" (UID: \"6bcf994e-a332-4d30-b965-6c5ddd6d6fa6\") " pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-w7ldw"
Apr 16 18:19:45.764665 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:45.764678 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/6bcf994e-a332-4d30-b965-6c5ddd6d6fa6-manager-config\") pod \"lws-controller-manager-65bdb464b4-w7ldw\" (UID: \"6bcf994e-a332-4d30-b965-6c5ddd6d6fa6\") " pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-w7ldw"
Apr 16 18:19:45.765004 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:45.764697 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwgq4\" (UniqueName: \"kubernetes.io/projected/6bcf994e-a332-4d30-b965-6c5ddd6d6fa6-kube-api-access-dwgq4\") pod \"lws-controller-manager-65bdb464b4-w7ldw\" (UID: \"6bcf994e-a332-4d30-b965-6c5ddd6d6fa6\") " pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-w7ldw"
Apr 16 18:19:45.765004 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:45.764817 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/6bcf994e-a332-4d30-b965-6c5ddd6d6fa6-metrics-cert\") pod \"lws-controller-manager-65bdb464b4-w7ldw\" (UID: \"6bcf994e-a332-4d30-b965-6c5ddd6d6fa6\") " pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-w7ldw"
Apr 16 18:19:45.865448 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:45.865405 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/6bcf994e-a332-4d30-b965-6c5ddd6d6fa6-metrics-cert\") pod \"lws-controller-manager-65bdb464b4-w7ldw\" (UID: \"6bcf994e-a332-4d30-b965-6c5ddd6d6fa6\") " pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-w7ldw"
Apr 16 18:19:45.865629 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:45.865554 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6bcf994e-a332-4d30-b965-6c5ddd6d6fa6-cert\") pod \"lws-controller-manager-65bdb464b4-w7ldw\" (UID: \"6bcf994e-a332-4d30-b965-6c5ddd6d6fa6\") " pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-w7ldw"
Apr 16 18:19:45.865629 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:45.865582 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/6bcf994e-a332-4d30-b965-6c5ddd6d6fa6-manager-config\") pod \"lws-controller-manager-65bdb464b4-w7ldw\" (UID: \"6bcf994e-a332-4d30-b965-6c5ddd6d6fa6\") " pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-w7ldw"
Apr 16 18:19:45.865629 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:45.865598 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dwgq4\" (UniqueName: \"kubernetes.io/projected/6bcf994e-a332-4d30-b965-6c5ddd6d6fa6-kube-api-access-dwgq4\") pod \"lws-controller-manager-65bdb464b4-w7ldw\" (UID: \"6bcf994e-a332-4d30-b965-6c5ddd6d6fa6\") " pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-w7ldw"
Apr 16 18:19:45.866297 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:45.866272 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/6bcf994e-a332-4d30-b965-6c5ddd6d6fa6-manager-config\") pod \"lws-controller-manager-65bdb464b4-w7ldw\" (UID: \"6bcf994e-a332-4d30-b965-6c5ddd6d6fa6\") " pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-w7ldw"
Apr 16 18:19:45.868037 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:45.868015 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6bcf994e-a332-4d30-b965-6c5ddd6d6fa6-cert\") pod \"lws-controller-manager-65bdb464b4-w7ldw\" (UID: \"6bcf994e-a332-4d30-b965-6c5ddd6d6fa6\") " pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-w7ldw"
Apr 16 18:19:45.868037 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:45.868035 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/6bcf994e-a332-4d30-b965-6c5ddd6d6fa6-metrics-cert\") pod \"lws-controller-manager-65bdb464b4-w7ldw\" (UID: \"6bcf994e-a332-4d30-b965-6c5ddd6d6fa6\") " pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-w7ldw"
Apr 16 18:19:45.886306 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:45.886271 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwgq4\" (UniqueName: \"kubernetes.io/projected/6bcf994e-a332-4d30-b965-6c5ddd6d6fa6-kube-api-access-dwgq4\") pod \"lws-controller-manager-65bdb464b4-w7ldw\" (UID: \"6bcf994e-a332-4d30-b965-6c5ddd6d6fa6\") " pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-w7ldw"
Apr 16 18:19:45.998492 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:45.998395 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-w7ldw"
Apr 16 18:19:46.140773 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:46.140745 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-65bdb464b4-w7ldw"]
Apr 16 18:19:46.143407 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:19:46.143381 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bcf994e_a332_4d30_b965_6c5ddd6d6fa6.slice/crio-057de55026ea340b0d96d353961c9f62b8499b9ca3bc26e292f01fa5a50f4ee6 WatchSource:0}: Error finding container 057de55026ea340b0d96d353961c9f62b8499b9ca3bc26e292f01fa5a50f4ee6: Status 404 returned error can't find the container with id 057de55026ea340b0d96d353961c9f62b8499b9ca3bc26e292f01fa5a50f4ee6
Apr 16 18:19:46.695380 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:46.695340 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-w7ldw" event={"ID":"6bcf994e-a332-4d30-b965-6c5ddd6d6fa6","Type":"ContainerStarted","Data":"057de55026ea340b0d96d353961c9f62b8499b9ca3bc26e292f01fa5a50f4ee6"}
Apr 16 18:19:48.704153 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:48.704054 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-w7ldw" event={"ID":"6bcf994e-a332-4d30-b965-6c5ddd6d6fa6","Type":"ContainerStarted","Data":"77fd14d9ea42a2dff3298e865f0b55cfc13ab95d59926deff59809a854ed01ab"}
Apr 16 18:19:48.704153 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:48.704127 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-w7ldw"
Apr 16 18:19:48.723343 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:48.723291 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-w7ldw" podStartSLOduration=1.4332802519999999 podStartE2EDuration="3.723275656s" podCreationTimestamp="2026-04-16 18:19:45 +0000 UTC" firstStartedPulling="2026-04-16 18:19:46.145204716 +0000 UTC m=+605.800439547" lastFinishedPulling="2026-04-16 18:19:48.435200121 +0000 UTC m=+608.090434951" observedRunningTime="2026-04-16 18:19:48.722144755 +0000 UTC m=+608.377379610" watchObservedRunningTime="2026-04-16 18:19:48.723275656 +0000 UTC m=+608.378510581"
Apr 16 18:19:59.710047 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:19:59.710008 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-w7ldw"
Apr 16 18:20:40.726837 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:20:40.726801 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-nlvv7"]
Apr 16 18:20:40.729919 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:20:40.729902 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-nlvv7"
Apr 16 18:20:40.732718 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:20:40.732693 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\""
Apr 16 18:20:40.732884 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:20:40.732741 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 16 18:20:40.734190 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:20:40.734162 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\""
Apr 16 18:20:40.734324 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:20:40.734238 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 16 18:20:40.734324 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:20:40.734269 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-4727g\""
Apr 16 18:20:40.737871 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:20:40.737847 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-nlvv7"]
Apr 16 18:20:40.905873 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:20:40.905807 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/627ff423-8d04-4247-b465-f4eedd171a6f-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-nlvv7\" (UID: \"627ff423-8d04-4247-b465-f4eedd171a6f\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-nlvv7"
Apr 16 18:20:40.906060 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:20:40.905894 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65twd\" (UniqueName: \"kubernetes.io/projected/627ff423-8d04-4247-b465-f4eedd171a6f-kube-api-access-65twd\") pod \"kuadrant-console-plugin-6c886788f8-nlvv7\" (UID: \"627ff423-8d04-4247-b465-f4eedd171a6f\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-nlvv7"
Apr 16 18:20:40.906060 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:20:40.905967 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/627ff423-8d04-4247-b465-f4eedd171a6f-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-nlvv7\" (UID: \"627ff423-8d04-4247-b465-f4eedd171a6f\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-nlvv7"
Apr 16 18:20:41.006444 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:20:41.006345 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/627ff423-8d04-4247-b465-f4eedd171a6f-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-nlvv7\" (UID: \"627ff423-8d04-4247-b465-f4eedd171a6f\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-nlvv7"
Apr 16 18:20:41.006444 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:20:41.006398 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-65twd\" (UniqueName: \"kubernetes.io/projected/627ff423-8d04-4247-b465-f4eedd171a6f-kube-api-access-65twd\") pod \"kuadrant-console-plugin-6c886788f8-nlvv7\" (UID: \"627ff423-8d04-4247-b465-f4eedd171a6f\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-nlvv7"
Apr 16 18:20:41.006679 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:20:41.006456 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/627ff423-8d04-4247-b465-f4eedd171a6f-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-nlvv7\" (UID: \"627ff423-8d04-4247-b465-f4eedd171a6f\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-nlvv7"
Apr 16 18:20:41.009004 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:20:41.008975 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\""
Apr 16 18:20:41.009156 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:20:41.009073 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\""
Apr 16 18:20:41.015559 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:20:41.015538 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 16 18:20:41.017218 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:20:41.017196 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/627ff423-8d04-4247-b465-f4eedd171a6f-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-nlvv7\" (UID: \"627ff423-8d04-4247-b465-f4eedd171a6f\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-nlvv7"
Apr 16 18:20:41.018794 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:20:41.018759 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/627ff423-8d04-4247-b465-f4eedd171a6f-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-nlvv7\" (UID: \"627ff423-8d04-4247-b465-f4eedd171a6f\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-nlvv7"
Apr 16 18:20:41.026474 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:20:41.026452 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 16 18:20:41.036992 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:20:41.036965 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-65twd\" (UniqueName: \"kubernetes.io/projected/627ff423-8d04-4247-b465-f4eedd171a6f-kube-api-access-65twd\") pod \"kuadrant-console-plugin-6c886788f8-nlvv7\" (UID: \"627ff423-8d04-4247-b465-f4eedd171a6f\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-nlvv7"
Apr 16 18:20:41.043505 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:20:41.043482 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-4727g\""
Apr 16 18:20:41.051559 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:20:41.051540 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-nlvv7"
Apr 16 18:20:41.176540 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:20:41.176508 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-nlvv7"]
Apr 16 18:20:41.179677 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:20:41.179646 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod627ff423_8d04_4247_b465_f4eedd171a6f.slice/crio-b8cb439f39b9a0f92dc8dc4811e571adfaa88379fb99b21f2e6f12b1da77557f WatchSource:0}: Error finding container b8cb439f39b9a0f92dc8dc4811e571adfaa88379fb99b21f2e6f12b1da77557f: Status 404 returned error can't find the container with id b8cb439f39b9a0f92dc8dc4811e571adfaa88379fb99b21f2e6f12b1da77557f
Apr 16 18:20:41.879155 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:20:41.879119 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-nlvv7" event={"ID":"627ff423-8d04-4247-b465-f4eedd171a6f","Type":"ContainerStarted","Data":"b8cb439f39b9a0f92dc8dc4811e571adfaa88379fb99b21f2e6f12b1da77557f"}
Apr 16 18:20:45.897141 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:20:45.897105 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-nlvv7" event={"ID":"627ff423-8d04-4247-b465-f4eedd171a6f","Type":"ContainerStarted","Data":"4b6938d352967f5dfa02e4a09115f1c5ed5d77900a48ec0db454076a06fcb38d"}
Apr 16 18:20:45.916487 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:20:45.916424 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-nlvv7" podStartSLOduration=1.302379128 podStartE2EDuration="5.916407498s" podCreationTimestamp="2026-04-16 18:20:40 +0000 UTC" firstStartedPulling="2026-04-16 18:20:41.181277286 +0000 UTC m=+660.836512117" lastFinishedPulling="2026-04-16 18:20:45.795305656 +0000 UTC m=+665.450540487" observedRunningTime="2026-04-16 18:20:45.915567608 +0000 UTC m=+665.570802461" watchObservedRunningTime="2026-04-16 18:20:45.916407498 +0000 UTC m=+665.571642353"
Apr 16 18:21:26.090543 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:26.090467 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-674b59b84c-qpqq8"]
Apr 16 18:21:26.093752 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:26.093724 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-qpqq8"
Apr 16 18:21:26.096309 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:26.096284 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-wmjdn\""
Apr 16 18:21:26.101374 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:26.101342 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-qpqq8"]
Apr 16 18:21:26.169076 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:26.169042 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tddnz\" (UniqueName: \"kubernetes.io/projected/865c91c8-1511-4877-a706-5131b0d8ee00-kube-api-access-tddnz\") pod \"authorino-674b59b84c-qpqq8\" (UID: \"865c91c8-1511-4877-a706-5131b0d8ee00\") " pod="kuadrant-system/authorino-674b59b84c-qpqq8"
Apr 16 18:21:26.269998 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:26.269964 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tddnz\" (UniqueName: \"kubernetes.io/projected/865c91c8-1511-4877-a706-5131b0d8ee00-kube-api-access-tddnz\") pod \"authorino-674b59b84c-qpqq8\" (UID: \"865c91c8-1511-4877-a706-5131b0d8ee00\") " pod="kuadrant-system/authorino-674b59b84c-qpqq8"
Apr 16 18:21:26.278895 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:26.278859 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tddnz\" (UniqueName: \"kubernetes.io/projected/865c91c8-1511-4877-a706-5131b0d8ee00-kube-api-access-tddnz\") pod \"authorino-674b59b84c-qpqq8\" (UID: \"865c91c8-1511-4877-a706-5131b0d8ee00\") " pod="kuadrant-system/authorino-674b59b84c-qpqq8"
Apr 16 18:21:26.407215 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:26.407163 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-qpqq8"
Apr 16 18:21:26.532711 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:26.532674 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-qpqq8"]
Apr 16 18:21:26.535699 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:21:26.535669 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod865c91c8_1511_4877_a706_5131b0d8ee00.slice/crio-56d492a8df827d4b4b3334716618a8c2ed81105e03639a58a8a329da73f0dbf0 WatchSource:0}: Error finding container 56d492a8df827d4b4b3334716618a8c2ed81105e03639a58a8a329da73f0dbf0: Status 404 returned error can't find the container with id 56d492a8df827d4b4b3334716618a8c2ed81105e03639a58a8a329da73f0dbf0
Apr 16 18:21:27.021249 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:27.021218 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-qpqq8" event={"ID":"865c91c8-1511-4877-a706-5131b0d8ee00","Type":"ContainerStarted","Data":"56d492a8df827d4b4b3334716618a8c2ed81105e03639a58a8a329da73f0dbf0"}
Apr 16 18:21:30.035428 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:30.035387 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-qpqq8" event={"ID":"865c91c8-1511-4877-a706-5131b0d8ee00","Type":"ContainerStarted","Data":"9c6a687f3e8004dd8859adf431ebd526ac66984f695f84d7e1687153aec8cf25"}
Apr 16 18:21:30.050871 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:30.050790 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-674b59b84c-qpqq8" podStartSLOduration=1.247679366 podStartE2EDuration="4.050773233s" podCreationTimestamp="2026-04-16 18:21:26 +0000 UTC" firstStartedPulling="2026-04-16 18:21:26.537031379 +0000 UTC m=+706.192266210" lastFinishedPulling="2026-04-16 18:21:29.340125239 +0000 UTC m=+708.995360077" observedRunningTime="2026-04-16 18:21:30.050165859 +0000 UTC m=+709.705400714" watchObservedRunningTime="2026-04-16 18:21:30.050773233 +0000 UTC m=+709.706008087"
Apr 16 18:21:32.430360 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:32.430321 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-qpqq8"]
Apr 16 18:21:32.430801 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:32.430509 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-674b59b84c-qpqq8" podUID="865c91c8-1511-4877-a706-5131b0d8ee00" containerName="authorino" containerID="cri-o://9c6a687f3e8004dd8859adf431ebd526ac66984f695f84d7e1687153aec8cf25" gracePeriod=30
Apr 16 18:21:32.675208 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:32.675178 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-qpqq8"
Apr 16 18:21:32.827132 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:32.827039 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tddnz\" (UniqueName: \"kubernetes.io/projected/865c91c8-1511-4877-a706-5131b0d8ee00-kube-api-access-tddnz\") pod \"865c91c8-1511-4877-a706-5131b0d8ee00\" (UID: \"865c91c8-1511-4877-a706-5131b0d8ee00\") "
Apr 16 18:21:32.829208 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:32.829173 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/865c91c8-1511-4877-a706-5131b0d8ee00-kube-api-access-tddnz" (OuterVolumeSpecName: "kube-api-access-tddnz") pod "865c91c8-1511-4877-a706-5131b0d8ee00" (UID: "865c91c8-1511-4877-a706-5131b0d8ee00"). InnerVolumeSpecName "kube-api-access-tddnz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:21:32.927908 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:32.927868 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tddnz\" (UniqueName: \"kubernetes.io/projected/865c91c8-1511-4877-a706-5131b0d8ee00-kube-api-access-tddnz\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\""
Apr 16 18:21:33.046342 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:33.046307 2574 generic.go:358] "Generic (PLEG): container finished" podID="865c91c8-1511-4877-a706-5131b0d8ee00" containerID="9c6a687f3e8004dd8859adf431ebd526ac66984f695f84d7e1687153aec8cf25" exitCode=0
Apr 16 18:21:33.046528 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:33.046355 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-qpqq8"
Apr 16 18:21:33.046528 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:33.046354 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-qpqq8" event={"ID":"865c91c8-1511-4877-a706-5131b0d8ee00","Type":"ContainerDied","Data":"9c6a687f3e8004dd8859adf431ebd526ac66984f695f84d7e1687153aec8cf25"}
Apr 16 18:21:33.046528 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:33.046478 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-qpqq8" event={"ID":"865c91c8-1511-4877-a706-5131b0d8ee00","Type":"ContainerDied","Data":"56d492a8df827d4b4b3334716618a8c2ed81105e03639a58a8a329da73f0dbf0"}
Apr 16 18:21:33.046528 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:33.046509 2574 scope.go:117] "RemoveContainer" containerID="9c6a687f3e8004dd8859adf431ebd526ac66984f695f84d7e1687153aec8cf25"
Apr 16 18:21:33.055094 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:33.055075 2574 scope.go:117] "RemoveContainer" containerID="9c6a687f3e8004dd8859adf431ebd526ac66984f695f84d7e1687153aec8cf25"
Apr 16 18:21:33.055392 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:21:33.055373 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c6a687f3e8004dd8859adf431ebd526ac66984f695f84d7e1687153aec8cf25\": container with ID starting with 9c6a687f3e8004dd8859adf431ebd526ac66984f695f84d7e1687153aec8cf25 not found: ID does not exist" containerID="9c6a687f3e8004dd8859adf431ebd526ac66984f695f84d7e1687153aec8cf25"
Apr 16 18:21:33.055445 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:33.055401 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c6a687f3e8004dd8859adf431ebd526ac66984f695f84d7e1687153aec8cf25"} err="failed to get container status \"9c6a687f3e8004dd8859adf431ebd526ac66984f695f84d7e1687153aec8cf25\": rpc error: code = NotFound desc = could not find container \"9c6a687f3e8004dd8859adf431ebd526ac66984f695f84d7e1687153aec8cf25\": container with ID starting with 9c6a687f3e8004dd8859adf431ebd526ac66984f695f84d7e1687153aec8cf25 not found: ID does not exist"
Apr 16 18:21:33.064282 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:33.064254 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-qpqq8"]
Apr 16 18:21:33.068071 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:33.068047 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-674b59b84c-qpqq8"]
Apr 16 18:21:34.939120 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:34.939084 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="865c91c8-1511-4877-a706-5131b0d8ee00" path="/var/lib/kubelet/pods/865c91c8-1511-4877-a706-5131b0d8ee00/volumes"
Apr 16 18:21:50.376258 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:50.376225 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-68bd676465-svbsx"]
Apr 16 18:21:50.376625 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:50.376494 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="865c91c8-1511-4877-a706-5131b0d8ee00" containerName="authorino" Apr 16 18:21:50.376625 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:50.376504 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="865c91c8-1511-4877-a706-5131b0d8ee00" containerName="authorino" Apr 16 18:21:50.376625 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:50.376558 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="865c91c8-1511-4877-a706-5131b0d8ee00" containerName="authorino" Apr 16 18:21:50.381936 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:50.381634 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-68bd676465-svbsx" Apr 16 18:21:50.384771 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:50.384742 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 16 18:21:50.385628 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:50.385599 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-wmjdn\"" Apr 16 18:21:50.386475 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:50.386451 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-svbsx"] Apr 16 18:21:50.472729 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:50.472688 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhd64\" (UniqueName: \"kubernetes.io/projected/040142a4-fe8d-4316-ac7a-7e333dc75e50-kube-api-access-fhd64\") pod \"authorino-68bd676465-svbsx\" (UID: \"040142a4-fe8d-4316-ac7a-7e333dc75e50\") " pod="kuadrant-system/authorino-68bd676465-svbsx" Apr 16 18:21:50.472729 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:50.472735 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: 
\"kubernetes.io/secret/040142a4-fe8d-4316-ac7a-7e333dc75e50-tls-cert\") pod \"authorino-68bd676465-svbsx\" (UID: \"040142a4-fe8d-4316-ac7a-7e333dc75e50\") " pod="kuadrant-system/authorino-68bd676465-svbsx" Apr 16 18:21:50.574059 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:50.574015 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fhd64\" (UniqueName: \"kubernetes.io/projected/040142a4-fe8d-4316-ac7a-7e333dc75e50-kube-api-access-fhd64\") pod \"authorino-68bd676465-svbsx\" (UID: \"040142a4-fe8d-4316-ac7a-7e333dc75e50\") " pod="kuadrant-system/authorino-68bd676465-svbsx" Apr 16 18:21:50.574246 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:50.574071 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/040142a4-fe8d-4316-ac7a-7e333dc75e50-tls-cert\") pod \"authorino-68bd676465-svbsx\" (UID: \"040142a4-fe8d-4316-ac7a-7e333dc75e50\") " pod="kuadrant-system/authorino-68bd676465-svbsx" Apr 16 18:21:50.576614 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:50.576595 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/040142a4-fe8d-4316-ac7a-7e333dc75e50-tls-cert\") pod \"authorino-68bd676465-svbsx\" (UID: \"040142a4-fe8d-4316-ac7a-7e333dc75e50\") " pod="kuadrant-system/authorino-68bd676465-svbsx" Apr 16 18:21:50.583797 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:50.583768 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhd64\" (UniqueName: \"kubernetes.io/projected/040142a4-fe8d-4316-ac7a-7e333dc75e50-kube-api-access-fhd64\") pod \"authorino-68bd676465-svbsx\" (UID: \"040142a4-fe8d-4316-ac7a-7e333dc75e50\") " pod="kuadrant-system/authorino-68bd676465-svbsx" Apr 16 18:21:50.692720 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:50.692688 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68bd676465-svbsx" Apr 16 18:21:50.818465 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:50.818417 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-svbsx"] Apr 16 18:21:50.821413 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:21:50.821370 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod040142a4_fe8d_4316_ac7a_7e333dc75e50.slice/crio-009b20fa23bf383019b1b5a063beba1c35024e6b2d2a3494dfbc67cd76788a3c WatchSource:0}: Error finding container 009b20fa23bf383019b1b5a063beba1c35024e6b2d2a3494dfbc67cd76788a3c: Status 404 returned error can't find the container with id 009b20fa23bf383019b1b5a063beba1c35024e6b2d2a3494dfbc67cd76788a3c Apr 16 18:21:51.104340 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:51.104248 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-svbsx" event={"ID":"040142a4-fe8d-4316-ac7a-7e333dc75e50","Type":"ContainerStarted","Data":"009b20fa23bf383019b1b5a063beba1c35024e6b2d2a3494dfbc67cd76788a3c"} Apr 16 18:21:52.109229 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:52.109194 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-svbsx" event={"ID":"040142a4-fe8d-4316-ac7a-7e333dc75e50","Type":"ContainerStarted","Data":"f07c34f5d9477de952ed78960a20a2884272c2d207b80ba3a69888fdad6161cf"} Apr 16 18:21:52.128586 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:21:52.128525 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-68bd676465-svbsx" podStartSLOduration=1.77397354 podStartE2EDuration="2.128504485s" podCreationTimestamp="2026-04-16 18:21:50 +0000 UTC" firstStartedPulling="2026-04-16 18:21:50.822949066 +0000 UTC m=+730.478183897" lastFinishedPulling="2026-04-16 18:21:51.177480008 +0000 UTC m=+730.832714842" 
observedRunningTime="2026-04-16 18:21:52.127376738 +0000 UTC m=+731.782611595" watchObservedRunningTime="2026-04-16 18:21:52.128504485 +0000 UTC m=+731.783739339" Apr 16 18:22:10.239758 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:22:10.239725 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-5596b59666-9h2h6"] Apr 16 18:22:10.245601 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:22:10.245571 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-5596b59666-9h2h6" Apr 16 18:22:10.248210 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:22:10.248181 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 18:22:10.249275 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:22:10.249250 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-dnphx\"" Apr 16 18:22:10.249275 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:22:10.249268 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 18:22:10.249454 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:22:10.249316 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 18:22:10.253719 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:22:10.253690 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-5596b59666-9h2h6"] Apr 16 18:22:10.332426 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:22:10.332382 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-964hv\" (UniqueName: \"kubernetes.io/projected/77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8-kube-api-access-964hv\") pod \"llmisvc-controller-manager-5596b59666-9h2h6\" (UID: \"77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8\") " 
pod="kserve/llmisvc-controller-manager-5596b59666-9h2h6" Apr 16 18:22:10.332601 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:22:10.332450 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8-cert\") pod \"llmisvc-controller-manager-5596b59666-9h2h6\" (UID: \"77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8\") " pod="kserve/llmisvc-controller-manager-5596b59666-9h2h6" Apr 16 18:22:10.432852 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:22:10.432793 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-964hv\" (UniqueName: \"kubernetes.io/projected/77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8-kube-api-access-964hv\") pod \"llmisvc-controller-manager-5596b59666-9h2h6\" (UID: \"77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8\") " pod="kserve/llmisvc-controller-manager-5596b59666-9h2h6" Apr 16 18:22:10.433027 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:22:10.432868 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8-cert\") pod \"llmisvc-controller-manager-5596b59666-9h2h6\" (UID: \"77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8\") " pod="kserve/llmisvc-controller-manager-5596b59666-9h2h6" Apr 16 18:22:10.435251 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:22:10.435221 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8-cert\") pod \"llmisvc-controller-manager-5596b59666-9h2h6\" (UID: \"77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8\") " pod="kserve/llmisvc-controller-manager-5596b59666-9h2h6" Apr 16 18:22:10.443607 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:22:10.443575 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-964hv\" (UniqueName: 
\"kubernetes.io/projected/77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8-kube-api-access-964hv\") pod \"llmisvc-controller-manager-5596b59666-9h2h6\" (UID: \"77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8\") " pod="kserve/llmisvc-controller-manager-5596b59666-9h2h6" Apr 16 18:22:10.559257 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:22:10.559164 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-5596b59666-9h2h6" Apr 16 18:22:10.687736 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:22:10.687710 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-5596b59666-9h2h6"] Apr 16 18:22:10.690328 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:22:10.690293 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod77eb27cc_6ac4_4adc_9f9a_bfb84cc33cf8.slice/crio-1018f065a7e9cfc15b5a6a2d8d3fa309fafbe1c6f3065304d10ef893d39702c0 WatchSource:0}: Error finding container 1018f065a7e9cfc15b5a6a2d8d3fa309fafbe1c6f3065304d10ef893d39702c0: Status 404 returned error can't find the container with id 1018f065a7e9cfc15b5a6a2d8d3fa309fafbe1c6f3065304d10ef893d39702c0 Apr 16 18:22:11.169769 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:22:11.169727 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5596b59666-9h2h6" event={"ID":"77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8","Type":"ContainerStarted","Data":"1018f065a7e9cfc15b5a6a2d8d3fa309fafbe1c6f3065304d10ef893d39702c0"} Apr 16 18:22:15.184125 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:22:15.184085 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5596b59666-9h2h6" event={"ID":"77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8","Type":"ContainerStarted","Data":"daf83bbaca171f725bc8bdee92f67893be949e0d9829e1572e504292c8e58547"} Apr 16 18:22:15.184607 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:22:15.184198 2574 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-5596b59666-9h2h6" Apr 16 18:22:15.202040 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:22:15.201986 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-5596b59666-9h2h6" podStartSLOduration=1.646302329 podStartE2EDuration="5.201968752s" podCreationTimestamp="2026-04-16 18:22:10 +0000 UTC" firstStartedPulling="2026-04-16 18:22:10.691516626 +0000 UTC m=+750.346751456" lastFinishedPulling="2026-04-16 18:22:14.247183044 +0000 UTC m=+753.902417879" observedRunningTime="2026-04-16 18:22:15.201781855 +0000 UTC m=+754.857016706" watchObservedRunningTime="2026-04-16 18:22:15.201968752 +0000 UTC m=+754.857203605" Apr 16 18:22:46.190297 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:22:46.190266 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-5596b59666-9h2h6" Apr 16 18:24:27.190562 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:27.190479 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt"] Apr 16 18:24:27.193786 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:27.193764 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt" Apr 16 18:24:27.197390 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:27.197365 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-7fwg5\"" Apr 16 18:24:27.197390 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:27.197382 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 18:24:27.197585 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:27.197365 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 18:24:27.197585 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:27.197370 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs\"" Apr 16 18:24:27.201959 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:27.201934 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt"] Apr 16 18:24:27.242902 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:27.242856 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ed67622b-1980-4bb1-acd2-1e40065cd1d4-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt\" (UID: \"ed67622b-1980-4bb1-acd2-1e40065cd1d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt" Apr 16 18:24:27.243084 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:27.242924 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ed67622b-1980-4bb1-acd2-1e40065cd1d4-model-cache\") 
pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt\" (UID: \"ed67622b-1980-4bb1-acd2-1e40065cd1d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt" Apr 16 18:24:27.243084 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:27.242967 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ed67622b-1980-4bb1-acd2-1e40065cd1d4-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt\" (UID: \"ed67622b-1980-4bb1-acd2-1e40065cd1d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt" Apr 16 18:24:27.243084 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:27.243043 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ed67622b-1980-4bb1-acd2-1e40065cd1d4-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt\" (UID: \"ed67622b-1980-4bb1-acd2-1e40065cd1d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt" Apr 16 18:24:27.243084 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:27.243081 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n22n\" (UniqueName: \"kubernetes.io/projected/ed67622b-1980-4bb1-acd2-1e40065cd1d4-kube-api-access-4n22n\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt\" (UID: \"ed67622b-1980-4bb1-acd2-1e40065cd1d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt" Apr 16 18:24:27.243247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:27.243124 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/ed67622b-1980-4bb1-acd2-1e40065cd1d4-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt\" (UID: \"ed67622b-1980-4bb1-acd2-1e40065cd1d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt" Apr 16 18:24:27.344136 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:27.344092 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ed67622b-1980-4bb1-acd2-1e40065cd1d4-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt\" (UID: \"ed67622b-1980-4bb1-acd2-1e40065cd1d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt" Apr 16 18:24:27.344136 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:27.344136 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ed67622b-1980-4bb1-acd2-1e40065cd1d4-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt\" (UID: \"ed67622b-1980-4bb1-acd2-1e40065cd1d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt" Apr 16 18:24:27.344390 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:27.344169 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ed67622b-1980-4bb1-acd2-1e40065cd1d4-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt\" (UID: \"ed67622b-1980-4bb1-acd2-1e40065cd1d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt" Apr 16 18:24:27.344390 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:27.344210 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ed67622b-1980-4bb1-acd2-1e40065cd1d4-dshm\") pod 
\"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt\" (UID: \"ed67622b-1980-4bb1-acd2-1e40065cd1d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt" Apr 16 18:24:27.344390 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:27.344231 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4n22n\" (UniqueName: \"kubernetes.io/projected/ed67622b-1980-4bb1-acd2-1e40065cd1d4-kube-api-access-4n22n\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt\" (UID: \"ed67622b-1980-4bb1-acd2-1e40065cd1d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt" Apr 16 18:24:27.344390 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:27.344273 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed67622b-1980-4bb1-acd2-1e40065cd1d4-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt\" (UID: \"ed67622b-1980-4bb1-acd2-1e40065cd1d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt" Apr 16 18:24:27.344674 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:27.344632 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ed67622b-1980-4bb1-acd2-1e40065cd1d4-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt\" (UID: \"ed67622b-1980-4bb1-acd2-1e40065cd1d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt" Apr 16 18:24:27.344786 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:27.344689 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ed67622b-1980-4bb1-acd2-1e40065cd1d4-model-cache\") pod 
\"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt\" (UID: \"ed67622b-1980-4bb1-acd2-1e40065cd1d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt" Apr 16 18:24:27.344786 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:27.344747 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed67622b-1980-4bb1-acd2-1e40065cd1d4-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt\" (UID: \"ed67622b-1980-4bb1-acd2-1e40065cd1d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt" Apr 16 18:24:27.346579 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:27.346552 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ed67622b-1980-4bb1-acd2-1e40065cd1d4-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt\" (UID: \"ed67622b-1980-4bb1-acd2-1e40065cd1d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt" Apr 16 18:24:27.346812 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:27.346793 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ed67622b-1980-4bb1-acd2-1e40065cd1d4-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt\" (UID: \"ed67622b-1980-4bb1-acd2-1e40065cd1d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt" Apr 16 18:24:27.352963 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:27.352939 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n22n\" (UniqueName: \"kubernetes.io/projected/ed67622b-1980-4bb1-acd2-1e40065cd1d4-kube-api-access-4n22n\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt\" (UID: 
\"ed67622b-1980-4bb1-acd2-1e40065cd1d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt" Apr 16 18:24:27.504702 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:27.504610 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt" Apr 16 18:24:27.643816 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:27.643787 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt"] Apr 16 18:24:27.646116 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:24:27.646082 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded67622b_1980_4bb1_acd2_1e40065cd1d4.slice/crio-e254e60aea16ee1c17e8cb2a060c8e18b1aaa1b1b3cd35a4d8580c149d686b96 WatchSource:0}: Error finding container e254e60aea16ee1c17e8cb2a060c8e18b1aaa1b1b3cd35a4d8580c149d686b96: Status 404 returned error can't find the container with id e254e60aea16ee1c17e8cb2a060c8e18b1aaa1b1b3cd35a4d8580c149d686b96 Apr 16 18:24:27.647933 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:27.647912 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:24:28.607463 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:28.607405 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt" event={"ID":"ed67622b-1980-4bb1-acd2-1e40065cd1d4","Type":"ContainerStarted","Data":"e254e60aea16ee1c17e8cb2a060c8e18b1aaa1b1b3cd35a4d8580c149d686b96"} Apr 16 18:24:31.621495 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:31.621447 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt" 
event={"ID":"ed67622b-1980-4bb1-acd2-1e40065cd1d4","Type":"ContainerStarted","Data":"b37cd064a04ab78d9e925243c69075319a9336cc6661fcce0f4797d8360647de"} Apr 16 18:24:35.641173 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:35.641134 2574 generic.go:358] "Generic (PLEG): container finished" podID="ed67622b-1980-4bb1-acd2-1e40065cd1d4" containerID="b37cd064a04ab78d9e925243c69075319a9336cc6661fcce0f4797d8360647de" exitCode=0 Apr 16 18:24:35.641669 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:35.641192 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt" event={"ID":"ed67622b-1980-4bb1-acd2-1e40065cd1d4","Type":"ContainerDied","Data":"b37cd064a04ab78d9e925243c69075319a9336cc6661fcce0f4797d8360647de"} Apr 16 18:24:37.656537 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:37.656492 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt" event={"ID":"ed67622b-1980-4bb1-acd2-1e40065cd1d4","Type":"ContainerStarted","Data":"d3251454e6001307d49f48e629fd402eb547bc05b89bfbe215af1ded22a9b737"} Apr 16 18:24:37.683238 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:37.683175 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt" podStartSLOduration=1.537545714 podStartE2EDuration="10.683158652s" podCreationTimestamp="2026-04-16 18:24:27 +0000 UTC" firstStartedPulling="2026-04-16 18:24:27.648077706 +0000 UTC m=+887.303312536" lastFinishedPulling="2026-04-16 18:24:36.793690632 +0000 UTC m=+896.448925474" observedRunningTime="2026-04-16 18:24:37.681031149 +0000 UTC m=+897.336266004" watchObservedRunningTime="2026-04-16 18:24:37.683158652 +0000 UTC m=+897.338393504" Apr 16 18:24:40.865561 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:40.865535 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4lsvd_1be4b879-19c7-4497-badb-3f90683cdd48/console-operator/1.log"
Apr 16 18:24:40.865561 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:40.865548    2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4lsvd_1be4b879-19c7-4497-badb-3f90683cdd48/console-operator/1.log"
Apr 16 18:24:40.871422 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:40.871400    2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wt28v_d40f8597-4c6c-46ba-9f26-4cea171429a6/ovn-acl-logging/0.log"
Apr 16 18:24:40.871567 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:40.871400    2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wt28v_d40f8597-4c6c-46ba-9f26-4cea171429a6/ovn-acl-logging/0.log"
Apr 16 18:24:47.505519 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:47.505479    2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt"
Apr 16 18:24:47.506003 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:47.505790    2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt"
Apr 16 18:24:47.518437 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:47.518411    2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt"
Apr 16 18:24:47.699012 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:47.698983    2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt"
Apr 16 18:24:57.539543 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:57.539504    2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt"]
Apr 16 18:24:57.539970 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:57.539767    2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt" podUID="ed67622b-1980-4bb1-acd2-1e40065cd1d4" containerName="main" containerID="cri-o://d3251454e6001307d49f48e629fd402eb547bc05b89bfbe215af1ded22a9b737" gracePeriod=30
Apr 16 18:24:57.688722 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:57.688675    2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt" podUID="ed67622b-1980-4bb1-acd2-1e40065cd1d4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.27:8000/health\": dial tcp 10.133.0.27:8000: connect: connection refused"
Apr 16 18:24:57.723476 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:57.723441    2574 generic.go:358] "Generic (PLEG): container finished" podID="ed67622b-1980-4bb1-acd2-1e40065cd1d4" containerID="d3251454e6001307d49f48e629fd402eb547bc05b89bfbe215af1ded22a9b737" exitCode=0
Apr 16 18:24:57.723651 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:57.723479    2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt" event={"ID":"ed67622b-1980-4bb1-acd2-1e40065cd1d4","Type":"ContainerDied","Data":"d3251454e6001307d49f48e629fd402eb547bc05b89bfbe215af1ded22a9b737"}
Apr 16 18:24:57.802512 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:57.802487    2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt"
Apr 16 18:24:57.918323 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:57.918285    2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ed67622b-1980-4bb1-acd2-1e40065cd1d4-dshm\") pod \"ed67622b-1980-4bb1-acd2-1e40065cd1d4\" (UID: \"ed67622b-1980-4bb1-acd2-1e40065cd1d4\") "
Apr 16 18:24:57.918323 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:57.918330    2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ed67622b-1980-4bb1-acd2-1e40065cd1d4-tls-certs\") pod \"ed67622b-1980-4bb1-acd2-1e40065cd1d4\" (UID: \"ed67622b-1980-4bb1-acd2-1e40065cd1d4\") "
Apr 16 18:24:57.918595 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:57.918366    2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ed67622b-1980-4bb1-acd2-1e40065cd1d4-model-cache\") pod \"ed67622b-1980-4bb1-acd2-1e40065cd1d4\" (UID: \"ed67622b-1980-4bb1-acd2-1e40065cd1d4\") "
Apr 16 18:24:57.918595 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:57.918383    2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ed67622b-1980-4bb1-acd2-1e40065cd1d4-home\") pod \"ed67622b-1980-4bb1-acd2-1e40065cd1d4\" (UID: \"ed67622b-1980-4bb1-acd2-1e40065cd1d4\") "
Apr 16 18:24:57.918595 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:57.918410    2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed67622b-1980-4bb1-acd2-1e40065cd1d4-kserve-provision-location\") pod \"ed67622b-1980-4bb1-acd2-1e40065cd1d4\" (UID: \"ed67622b-1980-4bb1-acd2-1e40065cd1d4\") "
Apr 16 18:24:57.918595 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:57.918437    2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n22n\" (UniqueName: \"kubernetes.io/projected/ed67622b-1980-4bb1-acd2-1e40065cd1d4-kube-api-access-4n22n\") pod \"ed67622b-1980-4bb1-acd2-1e40065cd1d4\" (UID: \"ed67622b-1980-4bb1-acd2-1e40065cd1d4\") "
Apr 16 18:24:57.918798 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:57.918664    2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed67622b-1980-4bb1-acd2-1e40065cd1d4-home" (OuterVolumeSpecName: "home") pod "ed67622b-1980-4bb1-acd2-1e40065cd1d4" (UID: "ed67622b-1980-4bb1-acd2-1e40065cd1d4"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:24:57.918798 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:57.918676    2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed67622b-1980-4bb1-acd2-1e40065cd1d4-model-cache" (OuterVolumeSpecName: "model-cache") pod "ed67622b-1980-4bb1-acd2-1e40065cd1d4" (UID: "ed67622b-1980-4bb1-acd2-1e40065cd1d4"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:24:57.920799 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:57.920768    2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed67622b-1980-4bb1-acd2-1e40065cd1d4-dshm" (OuterVolumeSpecName: "dshm") pod "ed67622b-1980-4bb1-acd2-1e40065cd1d4" (UID: "ed67622b-1980-4bb1-acd2-1e40065cd1d4"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:24:57.920941 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:57.920792    2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed67622b-1980-4bb1-acd2-1e40065cd1d4-kube-api-access-4n22n" (OuterVolumeSpecName: "kube-api-access-4n22n") pod "ed67622b-1980-4bb1-acd2-1e40065cd1d4" (UID: "ed67622b-1980-4bb1-acd2-1e40065cd1d4"). InnerVolumeSpecName "kube-api-access-4n22n". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:24:57.920941 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:57.920768    2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed67622b-1980-4bb1-acd2-1e40065cd1d4-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "ed67622b-1980-4bb1-acd2-1e40065cd1d4" (UID: "ed67622b-1980-4bb1-acd2-1e40065cd1d4"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:24:57.973900 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:57.973862    2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed67622b-1980-4bb1-acd2-1e40065cd1d4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ed67622b-1980-4bb1-acd2-1e40065cd1d4" (UID: "ed67622b-1980-4bb1-acd2-1e40065cd1d4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:24:58.019252 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:58.019213    2574 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ed67622b-1980-4bb1-acd2-1e40065cd1d4-model-cache\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\""
Apr 16 18:24:58.019252 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:58.019244    2574 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ed67622b-1980-4bb1-acd2-1e40065cd1d4-home\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\""
Apr 16 18:24:58.019252 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:58.019256    2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed67622b-1980-4bb1-acd2-1e40065cd1d4-kserve-provision-location\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\""
Apr 16 18:24:58.019499 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:58.019267    2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4n22n\" (UniqueName: \"kubernetes.io/projected/ed67622b-1980-4bb1-acd2-1e40065cd1d4-kube-api-access-4n22n\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\""
Apr 16 18:24:58.019499 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:58.019277    2574 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ed67622b-1980-4bb1-acd2-1e40065cd1d4-dshm\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\""
Apr 16 18:24:58.019499 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:58.019286    2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ed67622b-1980-4bb1-acd2-1e40065cd1d4-tls-certs\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\""
Apr 16 18:24:58.729529 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:58.729495    2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt" event={"ID":"ed67622b-1980-4bb1-acd2-1e40065cd1d4","Type":"ContainerDied","Data":"e254e60aea16ee1c17e8cb2a060c8e18b1aaa1b1b3cd35a4d8580c149d686b96"}
Apr 16 18:24:58.729993 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:58.729542    2574 scope.go:117] "RemoveContainer" containerID="d3251454e6001307d49f48e629fd402eb547bc05b89bfbe215af1ded22a9b737"
Apr 16 18:24:58.729993 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:58.729543    2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt"
Apr 16 18:24:58.737912 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:58.737814    2574 scope.go:117] "RemoveContainer" containerID="b37cd064a04ab78d9e925243c69075319a9336cc6661fcce0f4797d8360647de"
Apr 16 18:24:58.754394 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:58.754365    2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt"]
Apr 16 18:24:58.758317 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:58.758288    2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7cdfbcdb79bqrgt"]
Apr 16 18:24:58.938333 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:24:58.938291    2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed67622b-1980-4bb1-acd2-1e40065cd1d4" path="/var/lib/kubelet/pods/ed67622b-1980-4bb1-acd2-1e40065cd1d4/volumes"
Apr 16 18:25:14.795340 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:14.795306    2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg"]
Apr 16 18:25:14.795856 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:14.795718    2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed67622b-1980-4bb1-acd2-1e40065cd1d4" containerName="storage-initializer"
Apr 16 18:25:14.795856 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:14.795737    2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed67622b-1980-4bb1-acd2-1e40065cd1d4" containerName="storage-initializer"
Apr 16 18:25:14.795856 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:14.795763    2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed67622b-1980-4bb1-acd2-1e40065cd1d4" containerName="main"
Apr 16 18:25:14.795856 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:14.795771    2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed67622b-1980-4bb1-acd2-1e40065cd1d4" containerName="main"
Apr 16 18:25:14.796079 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:14.795869    2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="ed67622b-1980-4bb1-acd2-1e40065cd1d4" containerName="main"
Apr 16 18:25:14.801162 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:14.801135    2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg"
Apr 16 18:25:14.803913 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:14.803887    2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 18:25:14.804066 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:14.804001    2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-7fwg5\""
Apr 16 18:25:14.804066 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:14.804007    2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 16 18:25:14.804191 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:14.804174    2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\""
Apr 16 18:25:14.809558 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:14.809531    2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg"]
Apr 16 18:25:14.868924 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:14.868880    2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x88fz\" (UniqueName: \"kubernetes.io/projected/8bd89b8d-61eb-4073-9afb-73a4f5678ccf-kube-api-access-x88fz\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg\" (UID: \"8bd89b8d-61eb-4073-9afb-73a4f5678ccf\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg"
Apr 16 18:25:14.869099 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:14.868937    2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8bd89b8d-61eb-4073-9afb-73a4f5678ccf-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg\" (UID: \"8bd89b8d-61eb-4073-9afb-73a4f5678ccf\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg"
Apr 16 18:25:14.869099 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:14.868955    2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8bd89b8d-61eb-4073-9afb-73a4f5678ccf-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg\" (UID: \"8bd89b8d-61eb-4073-9afb-73a4f5678ccf\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg"
Apr 16 18:25:14.869099 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:14.868981    2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8bd89b8d-61eb-4073-9afb-73a4f5678ccf-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg\" (UID: \"8bd89b8d-61eb-4073-9afb-73a4f5678ccf\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg"
Apr 16 18:25:14.869099 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:14.869055    2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8bd89b8d-61eb-4073-9afb-73a4f5678ccf-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg\" (UID: \"8bd89b8d-61eb-4073-9afb-73a4f5678ccf\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg"
Apr 16 18:25:14.869099 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:14.869083    2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8bd89b8d-61eb-4073-9afb-73a4f5678ccf-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg\" (UID: \"8bd89b8d-61eb-4073-9afb-73a4f5678ccf\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg"
Apr 16 18:25:14.970018 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:14.969976    2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x88fz\" (UniqueName: \"kubernetes.io/projected/8bd89b8d-61eb-4073-9afb-73a4f5678ccf-kube-api-access-x88fz\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg\" (UID: \"8bd89b8d-61eb-4073-9afb-73a4f5678ccf\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg"
Apr 16 18:25:14.970215 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:14.970038    2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8bd89b8d-61eb-4073-9afb-73a4f5678ccf-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg\" (UID: \"8bd89b8d-61eb-4073-9afb-73a4f5678ccf\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg"
Apr 16 18:25:14.970215 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:14.970057    2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8bd89b8d-61eb-4073-9afb-73a4f5678ccf-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg\" (UID: \"8bd89b8d-61eb-4073-9afb-73a4f5678ccf\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg"
Apr 16 18:25:14.970215 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:14.970081    2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8bd89b8d-61eb-4073-9afb-73a4f5678ccf-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg\" (UID: \"8bd89b8d-61eb-4073-9afb-73a4f5678ccf\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg"
Apr 16 18:25:14.970215 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:14.970126    2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8bd89b8d-61eb-4073-9afb-73a4f5678ccf-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg\" (UID: \"8bd89b8d-61eb-4073-9afb-73a4f5678ccf\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg"
Apr 16 18:25:14.970215 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:14.970142    2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8bd89b8d-61eb-4073-9afb-73a4f5678ccf-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg\" (UID: \"8bd89b8d-61eb-4073-9afb-73a4f5678ccf\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg"
Apr 16 18:25:14.970508 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:14.970486    2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8bd89b8d-61eb-4073-9afb-73a4f5678ccf-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg\" (UID: \"8bd89b8d-61eb-4073-9afb-73a4f5678ccf\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg"
Apr 16 18:25:14.970573 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:14.970541    2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8bd89b8d-61eb-4073-9afb-73a4f5678ccf-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg\" (UID: \"8bd89b8d-61eb-4073-9afb-73a4f5678ccf\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg"
Apr 16 18:25:14.970627 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:14.970571    2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8bd89b8d-61eb-4073-9afb-73a4f5678ccf-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg\" (UID: \"8bd89b8d-61eb-4073-9afb-73a4f5678ccf\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg"
Apr 16 18:25:14.972467 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:14.972441    2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8bd89b8d-61eb-4073-9afb-73a4f5678ccf-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg\" (UID: \"8bd89b8d-61eb-4073-9afb-73a4f5678ccf\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg"
Apr 16 18:25:14.972712 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:14.972694    2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8bd89b8d-61eb-4073-9afb-73a4f5678ccf-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg\" (UID: \"8bd89b8d-61eb-4073-9afb-73a4f5678ccf\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg"
Apr 16 18:25:14.978369 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:14.978342    2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x88fz\" (UniqueName: \"kubernetes.io/projected/8bd89b8d-61eb-4073-9afb-73a4f5678ccf-kube-api-access-x88fz\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg\" (UID: \"8bd89b8d-61eb-4073-9afb-73a4f5678ccf\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg"
Apr 16 18:25:15.113659 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:15.113570    2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg"
Apr 16 18:25:15.254566 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:15.254523    2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg"]
Apr 16 18:25:15.258129 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:25:15.258092    2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bd89b8d_61eb_4073_9afb_73a4f5678ccf.slice/crio-30e423ce3c12057665e562510515b6113c66fef8467d41382f61a71a25fac8c1 WatchSource:0}: Error finding container 30e423ce3c12057665e562510515b6113c66fef8467d41382f61a71a25fac8c1: Status 404 returned error can't find the container with id 30e423ce3c12057665e562510515b6113c66fef8467d41382f61a71a25fac8c1
Apr 16 18:25:15.790703 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:15.790662    2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg" event={"ID":"8bd89b8d-61eb-4073-9afb-73a4f5678ccf","Type":"ContainerStarted","Data":"44bb16ecf7f3ae635032af13a766000a782a820bdfb3944e38f1d669ce401dd5"}
Apr 16 18:25:15.790703 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:15.790705    2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg" event={"ID":"8bd89b8d-61eb-4073-9afb-73a4f5678ccf","Type":"ContainerStarted","Data":"30e423ce3c12057665e562510515b6113c66fef8467d41382f61a71a25fac8c1"}
Apr 16 18:25:19.806473 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:19.806389    2574 generic.go:358] "Generic (PLEG): container finished" podID="8bd89b8d-61eb-4073-9afb-73a4f5678ccf" containerID="44bb16ecf7f3ae635032af13a766000a782a820bdfb3944e38f1d669ce401dd5" exitCode=0
Apr 16 18:25:19.806473 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:19.806464    2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg" event={"ID":"8bd89b8d-61eb-4073-9afb-73a4f5678ccf","Type":"ContainerDied","Data":"44bb16ecf7f3ae635032af13a766000a782a820bdfb3944e38f1d669ce401dd5"}
Apr 16 18:25:32.992859 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:32.992808    2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d"]
Apr 16 18:25:33.010603 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:33.010569    2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d"]
Apr 16 18:25:33.010844 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:33.010733    2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d"
Apr 16 18:25:33.014992 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:33.014956    2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-fv66f\""
Apr 16 18:25:33.015233 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:33.015209    2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\""
Apr 16 18:25:33.135242 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:33.135204    2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcx5q\" (UniqueName: \"kubernetes.io/projected/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002-kube-api-access-bcx5q\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d\" (UID: \"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d"
Apr 16 18:25:33.135242 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:33.135245    2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d\" (UID: \"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d"
Apr 16 18:25:33.135502 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:33.135274    2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d\" (UID: \"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d"
Apr 16 18:25:33.135502 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:33.135361    2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d\" (UID: \"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d"
Apr 16 18:25:33.135502 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:33.135407    2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d\" (UID: \"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d"
Apr 16 18:25:33.135502 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:33.135442    2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d\" (UID: \"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d"
Apr 16 18:25:33.236378 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:33.236336    2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d\" (UID: \"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d"
Apr 16 18:25:33.236567 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:33.236403    2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d\" (UID: \"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d"
Apr 16 18:25:33.236567 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:33.236443    2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d\" (UID: \"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d"
Apr 16 18:25:33.236567 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:33.236490    2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bcx5q\" (UniqueName: \"kubernetes.io/projected/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002-kube-api-access-bcx5q\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d\" (UID: \"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d"
Apr 16 18:25:33.236567 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:33.236519    2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d\" (UID: \"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d"
Apr 16 18:25:33.236567 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:33.236556    2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d\" (UID: \"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d"
Apr 16 18:25:33.236846 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:33.236807    2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d\" (UID: \"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d"
Apr 16 18:25:33.236913 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:33.236898    2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d\" (UID: \"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d"
Apr 16 18:25:33.236972 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:33.236941    2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d\" (UID: \"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d"
Apr 16 18:25:33.237148 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:33.237124    2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d\" (UID: \"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d"
Apr 16 18:25:33.239336 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:33.239308    2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d\" (UID: \"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d"
Apr 16 18:25:33.250121 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:33.250042    2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcx5q\" (UniqueName: \"kubernetes.io/projected/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002-kube-api-access-bcx5q\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d\" (UID: \"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d"
Apr 16 18:25:33.323448 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:33.323393    2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d"
Apr 16 18:25:33.481263 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:33.481229    2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d"]
Apr 16 18:25:33.484206 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:25:33.484175    2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8b54db2_2d49_4a12_a7a7_8fa3e6da4002.slice/crio-0e0532a3a4e5bfdce7ef92770703b351b870de576adf3e42c573cdf532a19b24 WatchSource:0}: Error finding container 0e0532a3a4e5bfdce7ef92770703b351b870de576adf3e42c573cdf532a19b24: Status 404 returned error can't find the container with id 0e0532a3a4e5bfdce7ef92770703b351b870de576adf3e42c573cdf532a19b24
Apr 16 18:25:33.875113 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:33.875023    2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d" event={"ID":"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002","Type":"ContainerStarted","Data":"71e813557f2f9fc90abe3cbd69e4e66fdd06faa0bfe37ca297f06bbf8c76c920"}
Apr 16 18:25:33.875113 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:33.875071    2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d" event={"ID":"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002","Type":"ContainerStarted","Data":"0e0532a3a4e5bfdce7ef92770703b351b870de576adf3e42c573cdf532a19b24"}
Apr 16 18:25:45.926508 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:45.926473    2574 generic.go:358] "Generic (PLEG): container finished" podID="f8b54db2-2d49-4a12-a7a7-8fa3e6da4002" containerID="71e813557f2f9fc90abe3cbd69e4e66fdd06faa0bfe37ca297f06bbf8c76c920" exitCode=0
Apr 16 18:25:45.926996 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:45.926561    2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d" event={"ID":"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002","Type":"ContainerDied","Data":"71e813557f2f9fc90abe3cbd69e4e66fdd06faa0bfe37ca297f06bbf8c76c920"}
Apr 16 18:25:46.932724 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:46.932690    2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg" event={"ID":"8bd89b8d-61eb-4073-9afb-73a4f5678ccf","Type":"ContainerStarted","Data":"c456595ddfc13d1651d915df16b90faf4a2a2e98e3127d2b3ed5b4d5bea202f7"}
Apr 16 18:25:46.962441 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:46.961911    2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg" podStartSLOduration=6.072139689 podStartE2EDuration="32.961890778s" podCreationTimestamp="2026-04-16 18:25:14 +0000 UTC" firstStartedPulling="2026-04-16 18:25:19.807636392 +0000 UTC m=+939.462871223" lastFinishedPulling="2026-04-16 18:25:46.697387468 +0000 UTC m=+966.352622312" observedRunningTime="2026-04-16 18:25:46.961579306 +0000 UTC m=+966.616814182" watchObservedRunningTime="2026-04-16 18:25:46.961890778 +0000 UTC m=+966.617125631"
Apr 16 18:25:47.937961 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:47.937878    2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d" event={"ID":"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002","Type":"ContainerStarted","Data":"986dd0c01b7fd35de5950d493cbcce18494224f4e34f7ccb42a065745353fdfb"}
Apr 16 18:25:55.114090 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:55.113992    2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg"
Apr 
16 18:25:55.114090 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:55.114049 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg" Apr 16 18:25:55.116011 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:25:55.115975 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg" podUID="8bd89b8d-61eb-4073-9afb-73a4f5678ccf" containerName="main" probeResult="failure" output="Get \"https://10.133.0.28:8000/health\": dial tcp 10.133.0.28:8000: connect: connection refused" Apr 16 18:26:05.114389 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:05.114338 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg" podUID="8bd89b8d-61eb-4073-9afb-73a4f5678ccf" containerName="main" probeResult="failure" output="Get \"https://10.133.0.28:8000/health\": dial tcp 10.133.0.28:8000: connect: connection refused" Apr 16 18:26:15.114179 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:15.114125 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg" podUID="8bd89b8d-61eb-4073-9afb-73a4f5678ccf" containerName="main" probeResult="failure" output="Get \"https://10.133.0.28:8000/health\": dial tcp 10.133.0.28:8000: connect: connection refused" Apr 16 18:26:21.077438 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:21.077342 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d" event={"ID":"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002","Type":"ContainerStarted","Data":"53ac04d87e37e3e6ed30fdd9d213faeeed901cfe022412e12dc6bf9f69f59cdd"} Apr 16 18:26:21.078280 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:21.078244 2574 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d" Apr 16 18:26:21.086500 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:21.086463 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d" podUID="f8b54db2-2d49-4a12-a7a7-8fa3e6da4002" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 18:26:21.103274 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:21.103197 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d" podStartSLOduration=14.203408643 podStartE2EDuration="49.103176235s" podCreationTimestamp="2026-04-16 18:25:32 +0000 UTC" firstStartedPulling="2026-04-16 18:25:45.928033418 +0000 UTC m=+965.583268254" lastFinishedPulling="2026-04-16 18:26:20.827801001 +0000 UTC m=+1000.483035846" observedRunningTime="2026-04-16 18:26:21.102715943 +0000 UTC m=+1000.757950816" watchObservedRunningTime="2026-04-16 18:26:21.103176235 +0000 UTC m=+1000.758411089" Apr 16 18:26:22.082931 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:22.082890 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d" podUID="f8b54db2-2d49-4a12-a7a7-8fa3e6da4002" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 18:26:23.324277 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:23.324233 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d" Apr 16 18:26:23.324751 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:23.324291 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d" Apr 16 18:26:23.326036 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:23.326000 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d" podUID="f8b54db2-2d49-4a12-a7a7-8fa3e6da4002" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 18:26:24.522214 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:24.522181 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g"] Apr 16 18:26:24.525848 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:24.525808 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g" Apr 16 18:26:24.528420 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:24.528397 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 16 18:26:24.535095 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:24.535055 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g"] Apr 16 18:26:24.627873 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:24.627817 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6542733d-d14b-47d9-8d61-66481bb2d380-tls-certs\") pod \"precise-prefix-cache-test-kserve-bc6465d9f-8g28g\" (UID: \"6542733d-d14b-47d9-8d61-66481bb2d380\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g" Apr 16 18:26:24.628067 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:24.627918 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6542733d-d14b-47d9-8d61-66481bb2d380-model-cache\") pod \"precise-prefix-cache-test-kserve-bc6465d9f-8g28g\" (UID: \"6542733d-d14b-47d9-8d61-66481bb2d380\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g" Apr 16 18:26:24.628067 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:24.627968 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6542733d-d14b-47d9-8d61-66481bb2d380-home\") pod \"precise-prefix-cache-test-kserve-bc6465d9f-8g28g\" (UID: \"6542733d-d14b-47d9-8d61-66481bb2d380\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g" Apr 16 18:26:24.628067 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:24.628011 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6p96\" (UniqueName: \"kubernetes.io/projected/6542733d-d14b-47d9-8d61-66481bb2d380-kube-api-access-t6p96\") pod \"precise-prefix-cache-test-kserve-bc6465d9f-8g28g\" (UID: \"6542733d-d14b-47d9-8d61-66481bb2d380\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g" Apr 16 18:26:24.628067 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:24.628055 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6542733d-d14b-47d9-8d61-66481bb2d380-dshm\") pod \"precise-prefix-cache-test-kserve-bc6465d9f-8g28g\" (UID: \"6542733d-d14b-47d9-8d61-66481bb2d380\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g" Apr 16 18:26:24.628222 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:24.628114 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/6542733d-d14b-47d9-8d61-66481bb2d380-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-bc6465d9f-8g28g\" (UID: \"6542733d-d14b-47d9-8d61-66481bb2d380\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g" Apr 16 18:26:24.729327 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:24.729278 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t6p96\" (UniqueName: \"kubernetes.io/projected/6542733d-d14b-47d9-8d61-66481bb2d380-kube-api-access-t6p96\") pod \"precise-prefix-cache-test-kserve-bc6465d9f-8g28g\" (UID: \"6542733d-d14b-47d9-8d61-66481bb2d380\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g" Apr 16 18:26:24.729521 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:24.729341 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6542733d-d14b-47d9-8d61-66481bb2d380-dshm\") pod \"precise-prefix-cache-test-kserve-bc6465d9f-8g28g\" (UID: \"6542733d-d14b-47d9-8d61-66481bb2d380\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g" Apr 16 18:26:24.729521 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:24.729421 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6542733d-d14b-47d9-8d61-66481bb2d380-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-bc6465d9f-8g28g\" (UID: \"6542733d-d14b-47d9-8d61-66481bb2d380\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g" Apr 16 18:26:24.729521 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:24.729490 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6542733d-d14b-47d9-8d61-66481bb2d380-tls-certs\") pod \"precise-prefix-cache-test-kserve-bc6465d9f-8g28g\" (UID: 
\"6542733d-d14b-47d9-8d61-66481bb2d380\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g" Apr 16 18:26:24.729521 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:24.729519 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6542733d-d14b-47d9-8d61-66481bb2d380-model-cache\") pod \"precise-prefix-cache-test-kserve-bc6465d9f-8g28g\" (UID: \"6542733d-d14b-47d9-8d61-66481bb2d380\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g" Apr 16 18:26:24.729739 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:24.729663 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6542733d-d14b-47d9-8d61-66481bb2d380-home\") pod \"precise-prefix-cache-test-kserve-bc6465d9f-8g28g\" (UID: \"6542733d-d14b-47d9-8d61-66481bb2d380\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g" Apr 16 18:26:24.729865 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:24.729840 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6542733d-d14b-47d9-8d61-66481bb2d380-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-bc6465d9f-8g28g\" (UID: \"6542733d-d14b-47d9-8d61-66481bb2d380\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g" Apr 16 18:26:24.730022 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:24.729990 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6542733d-d14b-47d9-8d61-66481bb2d380-model-cache\") pod \"precise-prefix-cache-test-kserve-bc6465d9f-8g28g\" (UID: \"6542733d-d14b-47d9-8d61-66481bb2d380\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g" Apr 16 18:26:24.730111 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:24.730021 
2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6542733d-d14b-47d9-8d61-66481bb2d380-home\") pod \"precise-prefix-cache-test-kserve-bc6465d9f-8g28g\" (UID: \"6542733d-d14b-47d9-8d61-66481bb2d380\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g" Apr 16 18:26:24.732032 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:24.732005 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6542733d-d14b-47d9-8d61-66481bb2d380-dshm\") pod \"precise-prefix-cache-test-kserve-bc6465d9f-8g28g\" (UID: \"6542733d-d14b-47d9-8d61-66481bb2d380\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g" Apr 16 18:26:24.732361 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:24.732332 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6542733d-d14b-47d9-8d61-66481bb2d380-tls-certs\") pod \"precise-prefix-cache-test-kserve-bc6465d9f-8g28g\" (UID: \"6542733d-d14b-47d9-8d61-66481bb2d380\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g" Apr 16 18:26:24.742657 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:24.742610 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6p96\" (UniqueName: \"kubernetes.io/projected/6542733d-d14b-47d9-8d61-66481bb2d380-kube-api-access-t6p96\") pod \"precise-prefix-cache-test-kserve-bc6465d9f-8g28g\" (UID: \"6542733d-d14b-47d9-8d61-66481bb2d380\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g" Apr 16 18:26:24.837920 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:24.837808 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g" Apr 16 18:26:25.011073 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:26:25.011026 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6542733d_d14b_47d9_8d61_66481bb2d380.slice/crio-8a49fe055fcb1d66addaa2ab74489643b6b9b0067c09b8044531f79ee7240df9 WatchSource:0}: Error finding container 8a49fe055fcb1d66addaa2ab74489643b6b9b0067c09b8044531f79ee7240df9: Status 404 returned error can't find the container with id 8a49fe055fcb1d66addaa2ab74489643b6b9b0067c09b8044531f79ee7240df9 Apr 16 18:26:25.012129 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:25.012066 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g"] Apr 16 18:26:25.093567 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:25.093457 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g" event={"ID":"6542733d-d14b-47d9-8d61-66481bb2d380","Type":"ContainerStarted","Data":"f8ee38dcc1669f1eab66891d82e554f3754a2070d8d633de707c6682bab2e1c5"} Apr 16 18:26:25.093567 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:25.093514 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g" event={"ID":"6542733d-d14b-47d9-8d61-66481bb2d380","Type":"ContainerStarted","Data":"8a49fe055fcb1d66addaa2ab74489643b6b9b0067c09b8044531f79ee7240df9"} Apr 16 18:26:25.114649 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:25.114588 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg" podUID="8bd89b8d-61eb-4073-9afb-73a4f5678ccf" containerName="main" probeResult="failure" output="Get \"https://10.133.0.28:8000/health\": dial tcp 10.133.0.28:8000: connect: 
connection refused" Apr 16 18:26:26.978705 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:26.977143 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d"] Apr 16 18:26:26.978705 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:26.977668 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d" podUID="f8b54db2-2d49-4a12-a7a7-8fa3e6da4002" containerName="main" containerID="cri-o://986dd0c01b7fd35de5950d493cbcce18494224f4e34f7ccb42a065745353fdfb" gracePeriod=30 Apr 16 18:26:26.978705 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:26.978215 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d" podUID="f8b54db2-2d49-4a12-a7a7-8fa3e6da4002" containerName="tokenizer" containerID="cri-o://53ac04d87e37e3e6ed30fdd9d213faeeed901cfe022412e12dc6bf9f69f59cdd" gracePeriod=30 Apr 16 18:26:26.988102 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:26.988050 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d" podUID="f8b54db2-2d49-4a12-a7a7-8fa3e6da4002" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 18:26:27.104120 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:27.104072 2574 generic.go:358] "Generic (PLEG): container finished" podID="f8b54db2-2d49-4a12-a7a7-8fa3e6da4002" containerID="986dd0c01b7fd35de5950d493cbcce18494224f4e34f7ccb42a065745353fdfb" exitCode=0 Apr 16 18:26:27.104277 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:27.104149 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d" 
event={"ID":"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002","Type":"ContainerDied","Data":"986dd0c01b7fd35de5950d493cbcce18494224f4e34f7ccb42a065745353fdfb"} Apr 16 18:26:30.115433 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:30.115379 2574 generic.go:358] "Generic (PLEG): container finished" podID="6542733d-d14b-47d9-8d61-66481bb2d380" containerID="f8ee38dcc1669f1eab66891d82e554f3754a2070d8d633de707c6682bab2e1c5" exitCode=0 Apr 16 18:26:30.115810 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:30.115452 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g" event={"ID":"6542733d-d14b-47d9-8d61-66481bb2d380","Type":"ContainerDied","Data":"f8ee38dcc1669f1eab66891d82e554f3754a2070d8d633de707c6682bab2e1c5"} Apr 16 18:26:31.120490 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:31.120457 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g" event={"ID":"6542733d-d14b-47d9-8d61-66481bb2d380","Type":"ContainerStarted","Data":"05f0aaf6a3ac25763098f6addd22bceef05762092c680c3e873aae2cbfe5590c"} Apr 16 18:26:31.143587 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:31.143537 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g" podStartSLOduration=7.143516817 podStartE2EDuration="7.143516817s" podCreationTimestamp="2026-04-16 18:26:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:26:31.14224088 +0000 UTC m=+1010.797475762" watchObservedRunningTime="2026-04-16 18:26:31.143516817 +0000 UTC m=+1010.798751670" Apr 16 18:26:34.838263 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:34.838224 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g" Apr 16 
18:26:34.838705 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:34.838372 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g" Apr 16 18:26:34.851128 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:34.851094 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g" Apr 16 18:26:35.114683 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:35.114589 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg" podUID="8bd89b8d-61eb-4073-9afb-73a4f5678ccf" containerName="main" probeResult="failure" output="Get \"https://10.133.0.28:8000/health\": dial tcp 10.133.0.28:8000: connect: connection refused" Apr 16 18:26:35.152125 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:35.152089 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g" Apr 16 18:26:36.979273 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:26:36.979238 2574 logging.go:55] [core] [Channel #32 SubChannel #33]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.29:9003", ServerName: "10.133.0.29:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.29:9003: connect: connection refused" Apr 16 18:26:37.979610 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:37.979566 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d" podUID="f8b54db2-2d49-4a12-a7a7-8fa3e6da4002" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.29:9003\" within 1s: context deadline exceeded" Apr 16 18:26:45.114226 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:45.114177 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg" podUID="8bd89b8d-61eb-4073-9afb-73a4f5678ccf" containerName="main" probeResult="failure" output="Get \"https://10.133.0.28:8000/health\": dial tcp 10.133.0.28:8000: connect: connection refused" Apr 16 18:26:46.978949 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:26:46.978910 2574 logging.go:55] [core] [Channel #34 SubChannel #35]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.29:9003", ServerName: "10.133.0.29:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.29:9003: connect: connection refused" Apr 16 18:26:47.978797 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:47.978754 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d" podUID="f8b54db2-2d49-4a12-a7a7-8fa3e6da4002" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.29:9003\" within 1s: context deadline exceeded" Apr 16 18:26:55.114629 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:55.114520 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg" podUID="8bd89b8d-61eb-4073-9afb-73a4f5678ccf" containerName="main" probeResult="failure" output="Get \"https://10.133.0.28:8000/health\": dial tcp 10.133.0.28:8000: connect: connection refused" Apr 16 18:26:56.979358 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:26:56.979323 2574 logging.go:55] [core] [Channel #36 SubChannel #37]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.29:9003", ServerName: "10.133.0.29:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.29:9003: connect: connection refused" Apr 16 18:26:57.218262 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.218223 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d_f8b54db2-2d49-4a12-a7a7-8fa3e6da4002/tokenizer/0.log" Apr 16 18:26:57.218988 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.218955 2574 generic.go:358] "Generic (PLEG): container finished" podID="f8b54db2-2d49-4a12-a7a7-8fa3e6da4002" containerID="53ac04d87e37e3e6ed30fdd9d213faeeed901cfe022412e12dc6bf9f69f59cdd" exitCode=137 Apr 16 18:26:57.219120 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.219031 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d" event={"ID":"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002","Type":"ContainerDied","Data":"53ac04d87e37e3e6ed30fdd9d213faeeed901cfe022412e12dc6bf9f69f59cdd"} Apr 16 18:26:57.589327 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.589294 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g"] Apr 16 18:26:57.589735 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.589687 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g" podUID="6542733d-d14b-47d9-8d61-66481bb2d380" containerName="main" containerID="cri-o://05f0aaf6a3ac25763098f6addd22bceef05762092c680c3e873aae2cbfe5590c" gracePeriod=30 Apr 16 18:26:57.716051 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.716013 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d_f8b54db2-2d49-4a12-a7a7-8fa3e6da4002/tokenizer/0.log" Apr 16 18:26:57.716858 ip-10-0-142-43 
kubenswrapper[2574]: I0416 18:26:57.716808 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d" Apr 16 18:26:57.825998 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.825958 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002-kserve-provision-location\") pod \"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002\" (UID: \"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002\") " Apr 16 18:26:57.826195 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.826025 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002-tokenizer-tmp\") pod \"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002\" (UID: \"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002\") " Apr 16 18:26:57.826195 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.826085 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002-tls-certs\") pod \"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002\" (UID: \"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002\") " Apr 16 18:26:57.826195 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.826127 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002-tokenizer-cache\") pod \"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002\" (UID: \"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002\") " Apr 16 18:26:57.826195 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.826167 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcx5q\" (UniqueName: 
\"kubernetes.io/projected/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002-kube-api-access-bcx5q\") pod \"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002\" (UID: \"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002\") " Apr 16 18:26:57.826195 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.826193 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002-tokenizer-uds\") pod \"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002\" (UID: \"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002\") " Apr 16 18:26:57.826844 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.826738 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "f8b54db2-2d49-4a12-a7a7-8fa3e6da4002" (UID: "f8b54db2-2d49-4a12-a7a7-8fa3e6da4002"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:26:57.826844 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.826753 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "f8b54db2-2d49-4a12-a7a7-8fa3e6da4002" (UID: "f8b54db2-2d49-4a12-a7a7-8fa3e6da4002"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:26:57.827167 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.827144 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "f8b54db2-2d49-4a12-a7a7-8fa3e6da4002" (UID: "f8b54db2-2d49-4a12-a7a7-8fa3e6da4002"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:26:57.827365 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.827306 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f8b54db2-2d49-4a12-a7a7-8fa3e6da4002" (UID: "f8b54db2-2d49-4a12-a7a7-8fa3e6da4002"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:26:57.828789 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.828759 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f8b54db2-2d49-4a12-a7a7-8fa3e6da4002" (UID: "f8b54db2-2d49-4a12-a7a7-8fa3e6da4002"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:26:57.829066 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.829040 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002-kube-api-access-bcx5q" (OuterVolumeSpecName: "kube-api-access-bcx5q") pod "f8b54db2-2d49-4a12-a7a7-8fa3e6da4002" (UID: "f8b54db2-2d49-4a12-a7a7-8fa3e6da4002"). InnerVolumeSpecName "kube-api-access-bcx5q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:26:57.848225 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.848202 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g" Apr 16 18:26:57.927615 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.927571 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6542733d-d14b-47d9-8d61-66481bb2d380-model-cache\") pod \"6542733d-d14b-47d9-8d61-66481bb2d380\" (UID: \"6542733d-d14b-47d9-8d61-66481bb2d380\") " Apr 16 18:26:57.927786 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.927651 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6542733d-d14b-47d9-8d61-66481bb2d380-tls-certs\") pod \"6542733d-d14b-47d9-8d61-66481bb2d380\" (UID: \"6542733d-d14b-47d9-8d61-66481bb2d380\") " Apr 16 18:26:57.927786 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.927687 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6542733d-d14b-47d9-8d61-66481bb2d380-kserve-provision-location\") pod \"6542733d-d14b-47d9-8d61-66481bb2d380\" (UID: \"6542733d-d14b-47d9-8d61-66481bb2d380\") " Apr 16 18:26:57.927786 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.927739 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6542733d-d14b-47d9-8d61-66481bb2d380-dshm\") pod \"6542733d-d14b-47d9-8d61-66481bb2d380\" (UID: \"6542733d-d14b-47d9-8d61-66481bb2d380\") " Apr 16 18:26:57.927974 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.927792 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6542733d-d14b-47d9-8d61-66481bb2d380-home\") pod \"6542733d-d14b-47d9-8d61-66481bb2d380\" (UID: \"6542733d-d14b-47d9-8d61-66481bb2d380\") " Apr 16 18:26:57.927974 ip-10-0-142-43 kubenswrapper[2574]: I0416 
18:26:57.927856 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6p96\" (UniqueName: \"kubernetes.io/projected/6542733d-d14b-47d9-8d61-66481bb2d380-kube-api-access-t6p96\") pod \"6542733d-d14b-47d9-8d61-66481bb2d380\" (UID: \"6542733d-d14b-47d9-8d61-66481bb2d380\") " Apr 16 18:26:57.928081 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.927983 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6542733d-d14b-47d9-8d61-66481bb2d380-model-cache" (OuterVolumeSpecName: "model-cache") pod "6542733d-d14b-47d9-8d61-66481bb2d380" (UID: "6542733d-d14b-47d9-8d61-66481bb2d380"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:26:57.928367 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.928176 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002-kserve-provision-location\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:26:57.928367 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.928205 2574 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6542733d-d14b-47d9-8d61-66481bb2d380-model-cache\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:26:57.928367 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.928220 2574 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002-tokenizer-tmp\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:26:57.928367 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.928234 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002-tls-certs\") on node \"ip-10-0-142-43.ec2.internal\" 
DevicePath \"\"" Apr 16 18:26:57.928367 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.928246 2574 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002-tokenizer-cache\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:26:57.928367 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.928262 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bcx5q\" (UniqueName: \"kubernetes.io/projected/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002-kube-api-access-bcx5q\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:26:57.928367 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.928281 2574 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002-tokenizer-uds\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:26:57.928367 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.928311 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6542733d-d14b-47d9-8d61-66481bb2d380-home" (OuterVolumeSpecName: "home") pod "6542733d-d14b-47d9-8d61-66481bb2d380" (UID: "6542733d-d14b-47d9-8d61-66481bb2d380"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:26:57.930278 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.930241 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6542733d-d14b-47d9-8d61-66481bb2d380-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6542733d-d14b-47d9-8d61-66481bb2d380" (UID: "6542733d-d14b-47d9-8d61-66481bb2d380"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:26:57.930563 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.930540 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6542733d-d14b-47d9-8d61-66481bb2d380-dshm" (OuterVolumeSpecName: "dshm") pod "6542733d-d14b-47d9-8d61-66481bb2d380" (UID: "6542733d-d14b-47d9-8d61-66481bb2d380"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:26:57.930670 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.930654 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6542733d-d14b-47d9-8d61-66481bb2d380-kube-api-access-t6p96" (OuterVolumeSpecName: "kube-api-access-t6p96") pod "6542733d-d14b-47d9-8d61-66481bb2d380" (UID: "6542733d-d14b-47d9-8d61-66481bb2d380"). InnerVolumeSpecName "kube-api-access-t6p96". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:26:57.980147 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.980094 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d" podUID="f8b54db2-2d49-4a12-a7a7-8fa3e6da4002" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.29:9003\" within 1s: context deadline exceeded" Apr 16 18:26:57.985845 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:57.985798 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6542733d-d14b-47d9-8d61-66481bb2d380-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6542733d-d14b-47d9-8d61-66481bb2d380" (UID: "6542733d-d14b-47d9-8d61-66481bb2d380"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:26:58.029596 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:58.029511 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6542733d-d14b-47d9-8d61-66481bb2d380-tls-certs\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:26:58.029596 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:58.029544 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6542733d-d14b-47d9-8d61-66481bb2d380-kserve-provision-location\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:26:58.029596 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:58.029558 2574 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6542733d-d14b-47d9-8d61-66481bb2d380-dshm\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:26:58.029596 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:58.029568 2574 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6542733d-d14b-47d9-8d61-66481bb2d380-home\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:26:58.029596 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:58.029578 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t6p96\" (UniqueName: \"kubernetes.io/projected/6542733d-d14b-47d9-8d61-66481bb2d380-kube-api-access-t6p96\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:26:58.223719 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:58.223678 2574 generic.go:358] "Generic (PLEG): container finished" podID="6542733d-d14b-47d9-8d61-66481bb2d380" containerID="05f0aaf6a3ac25763098f6addd22bceef05762092c680c3e873aae2cbfe5590c" exitCode=0 Apr 16 18:26:58.223929 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:58.223754 2574 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g" Apr 16 18:26:58.223929 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:58.223752 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g" event={"ID":"6542733d-d14b-47d9-8d61-66481bb2d380","Type":"ContainerDied","Data":"05f0aaf6a3ac25763098f6addd22bceef05762092c680c3e873aae2cbfe5590c"} Apr 16 18:26:58.223929 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:58.223792 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g" event={"ID":"6542733d-d14b-47d9-8d61-66481bb2d380","Type":"ContainerDied","Data":"8a49fe055fcb1d66addaa2ab74489643b6b9b0067c09b8044531f79ee7240df9"} Apr 16 18:26:58.223929 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:58.223814 2574 scope.go:117] "RemoveContainer" containerID="05f0aaf6a3ac25763098f6addd22bceef05762092c680c3e873aae2cbfe5590c" Apr 16 18:26:58.225242 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:58.225211 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d_f8b54db2-2d49-4a12-a7a7-8fa3e6da4002/tokenizer/0.log" Apr 16 18:26:58.226071 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:58.226044 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d" event={"ID":"f8b54db2-2d49-4a12-a7a7-8fa3e6da4002","Type":"ContainerDied","Data":"0e0532a3a4e5bfdce7ef92770703b351b870de576adf3e42c573cdf532a19b24"} Apr 16 18:26:58.226211 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:58.226080 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d" Apr 16 18:26:58.240366 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:58.240336 2574 scope.go:117] "RemoveContainer" containerID="f8ee38dcc1669f1eab66891d82e554f3754a2070d8d633de707c6682bab2e1c5" Apr 16 18:26:58.257407 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:58.257365 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g"] Apr 16 18:26:58.262835 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:58.262790 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-8g28g"] Apr 16 18:26:58.276676 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:58.276633 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d"] Apr 16 18:26:58.282509 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:58.282408 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96rgb8d"] Apr 16 18:26:58.310455 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:58.310431 2574 scope.go:117] "RemoveContainer" containerID="05f0aaf6a3ac25763098f6addd22bceef05762092c680c3e873aae2cbfe5590c" Apr 16 18:26:58.310877 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:26:58.310853 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05f0aaf6a3ac25763098f6addd22bceef05762092c680c3e873aae2cbfe5590c\": container with ID starting with 05f0aaf6a3ac25763098f6addd22bceef05762092c680c3e873aae2cbfe5590c not found: ID does not exist" containerID="05f0aaf6a3ac25763098f6addd22bceef05762092c680c3e873aae2cbfe5590c" Apr 16 18:26:58.310939 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:58.310889 2574 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"05f0aaf6a3ac25763098f6addd22bceef05762092c680c3e873aae2cbfe5590c"} err="failed to get container status \"05f0aaf6a3ac25763098f6addd22bceef05762092c680c3e873aae2cbfe5590c\": rpc error: code = NotFound desc = could not find container \"05f0aaf6a3ac25763098f6addd22bceef05762092c680c3e873aae2cbfe5590c\": container with ID starting with 05f0aaf6a3ac25763098f6addd22bceef05762092c680c3e873aae2cbfe5590c not found: ID does not exist" Apr 16 18:26:58.310939 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:58.310910 2574 scope.go:117] "RemoveContainer" containerID="f8ee38dcc1669f1eab66891d82e554f3754a2070d8d633de707c6682bab2e1c5" Apr 16 18:26:58.311218 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:26:58.311200 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8ee38dcc1669f1eab66891d82e554f3754a2070d8d633de707c6682bab2e1c5\": container with ID starting with f8ee38dcc1669f1eab66891d82e554f3754a2070d8d633de707c6682bab2e1c5 not found: ID does not exist" containerID="f8ee38dcc1669f1eab66891d82e554f3754a2070d8d633de707c6682bab2e1c5" Apr 16 18:26:58.311336 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:58.311224 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8ee38dcc1669f1eab66891d82e554f3754a2070d8d633de707c6682bab2e1c5"} err="failed to get container status \"f8ee38dcc1669f1eab66891d82e554f3754a2070d8d633de707c6682bab2e1c5\": rpc error: code = NotFound desc = could not find container \"f8ee38dcc1669f1eab66891d82e554f3754a2070d8d633de707c6682bab2e1c5\": container with ID starting with f8ee38dcc1669f1eab66891d82e554f3754a2070d8d633de707c6682bab2e1c5 not found: ID does not exist" Apr 16 18:26:58.311336 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:58.311241 2574 scope.go:117] "RemoveContainer" containerID="53ac04d87e37e3e6ed30fdd9d213faeeed901cfe022412e12dc6bf9f69f59cdd" Apr 16 18:26:58.319408 
ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:58.319387 2574 scope.go:117] "RemoveContainer" containerID="986dd0c01b7fd35de5950d493cbcce18494224f4e34f7ccb42a065745353fdfb" Apr 16 18:26:58.328503 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:58.328482 2574 scope.go:117] "RemoveContainer" containerID="71e813557f2f9fc90abe3cbd69e4e66fdd06faa0bfe37ca297f06bbf8c76c920" Apr 16 18:26:58.938167 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:58.938136 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6542733d-d14b-47d9-8d61-66481bb2d380" path="/var/lib/kubelet/pods/6542733d-d14b-47d9-8d61-66481bb2d380/volumes" Apr 16 18:26:58.938570 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:26:58.938557 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8b54db2-2d49-4a12-a7a7-8fa3e6da4002" path="/var/lib/kubelet/pods/f8b54db2-2d49-4a12-a7a7-8fa3e6da4002/volumes" Apr 16 18:27:05.114205 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:05.114162 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg" podUID="8bd89b8d-61eb-4073-9afb-73a4f5678ccf" containerName="main" probeResult="failure" output="Get \"https://10.133.0.28:8000/health\": dial tcp 10.133.0.28:8000: connect: connection refused" Apr 16 18:27:15.114686 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:15.114640 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg" podUID="8bd89b8d-61eb-4073-9afb-73a4f5678ccf" containerName="main" probeResult="failure" output="Get \"https://10.133.0.28:8000/health\": dial tcp 10.133.0.28:8000: connect: connection refused" Apr 16 18:27:23.100627 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:23.100528 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv"] Apr 16 18:27:23.101135 ip-10-0-142-43 
kubenswrapper[2574]: I0416 18:27:23.101059 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6542733d-d14b-47d9-8d61-66481bb2d380" containerName="storage-initializer" Apr 16 18:27:23.101135 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:23.101080 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="6542733d-d14b-47d9-8d61-66481bb2d380" containerName="storage-initializer" Apr 16 18:27:23.101135 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:23.101101 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8b54db2-2d49-4a12-a7a7-8fa3e6da4002" containerName="tokenizer" Apr 16 18:27:23.101135 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:23.101113 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8b54db2-2d49-4a12-a7a7-8fa3e6da4002" containerName="tokenizer" Apr 16 18:27:23.101135 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:23.101126 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8b54db2-2d49-4a12-a7a7-8fa3e6da4002" containerName="storage-initializer" Apr 16 18:27:23.101135 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:23.101136 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8b54db2-2d49-4a12-a7a7-8fa3e6da4002" containerName="storage-initializer" Apr 16 18:27:23.101452 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:23.101160 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6542733d-d14b-47d9-8d61-66481bb2d380" containerName="main" Apr 16 18:27:23.101452 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:23.101169 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="6542733d-d14b-47d9-8d61-66481bb2d380" containerName="main" Apr 16 18:27:23.101452 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:23.101182 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8b54db2-2d49-4a12-a7a7-8fa3e6da4002" containerName="main" Apr 16 18:27:23.101452 ip-10-0-142-43 
kubenswrapper[2574]: I0416 18:27:23.101191 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8b54db2-2d49-4a12-a7a7-8fa3e6da4002" containerName="main" Apr 16 18:27:23.101452 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:23.101293 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8b54db2-2d49-4a12-a7a7-8fa3e6da4002" containerName="tokenizer" Apr 16 18:27:23.101452 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:23.101308 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="6542733d-d14b-47d9-8d61-66481bb2d380" containerName="main" Apr 16 18:27:23.101452 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:23.101323 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8b54db2-2d49-4a12-a7a7-8fa3e6da4002" containerName="main" Apr 16 18:27:23.106264 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:23.106227 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv" Apr 16 18:27:23.109093 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:23.109065 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"conv-test-round-trip-kserve-self-signed-certs\"" Apr 16 18:27:23.115782 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:23.115749 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv"] Apr 16 18:27:23.250536 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:23.250504 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d5613941-d5be-46e8-9a2b-8d88944b4b95-dshm\") pod \"conv-test-round-trip-kserve-c84d7687-b4pvv\" (UID: \"d5613941-d5be-46e8-9a2b-8d88944b4b95\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv" Apr 16 18:27:23.250714 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:23.250544 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d5613941-d5be-46e8-9a2b-8d88944b4b95-kserve-provision-location\") pod \"conv-test-round-trip-kserve-c84d7687-b4pvv\" (UID: \"d5613941-d5be-46e8-9a2b-8d88944b4b95\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv" Apr 16 18:27:23.250714 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:23.250574 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtrnr\" (UniqueName: \"kubernetes.io/projected/d5613941-d5be-46e8-9a2b-8d88944b4b95-kube-api-access-mtrnr\") pod \"conv-test-round-trip-kserve-c84d7687-b4pvv\" (UID: \"d5613941-d5be-46e8-9a2b-8d88944b4b95\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv" Apr 16 18:27:23.250714 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:23.250696 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d5613941-d5be-46e8-9a2b-8d88944b4b95-tls-certs\") pod \"conv-test-round-trip-kserve-c84d7687-b4pvv\" (UID: \"d5613941-d5be-46e8-9a2b-8d88944b4b95\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv" Apr 16 18:27:23.250886 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:23.250754 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d5613941-d5be-46e8-9a2b-8d88944b4b95-home\") pod \"conv-test-round-trip-kserve-c84d7687-b4pvv\" (UID: \"d5613941-d5be-46e8-9a2b-8d88944b4b95\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv" Apr 16 18:27:23.250886 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:23.250787 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/d5613941-d5be-46e8-9a2b-8d88944b4b95-model-cache\") pod \"conv-test-round-trip-kserve-c84d7687-b4pvv\" (UID: \"d5613941-d5be-46e8-9a2b-8d88944b4b95\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv" Apr 16 18:27:23.351748 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:23.351640 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d5613941-d5be-46e8-9a2b-8d88944b4b95-dshm\") pod \"conv-test-round-trip-kserve-c84d7687-b4pvv\" (UID: \"d5613941-d5be-46e8-9a2b-8d88944b4b95\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv" Apr 16 18:27:23.351955 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:23.351807 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d5613941-d5be-46e8-9a2b-8d88944b4b95-kserve-provision-location\") pod \"conv-test-round-trip-kserve-c84d7687-b4pvv\" (UID: \"d5613941-d5be-46e8-9a2b-8d88944b4b95\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv" Apr 16 18:27:23.351955 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:23.351880 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mtrnr\" (UniqueName: \"kubernetes.io/projected/d5613941-d5be-46e8-9a2b-8d88944b4b95-kube-api-access-mtrnr\") pod \"conv-test-round-trip-kserve-c84d7687-b4pvv\" (UID: \"d5613941-d5be-46e8-9a2b-8d88944b4b95\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv" Apr 16 18:27:23.351955 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:23.351921 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d5613941-d5be-46e8-9a2b-8d88944b4b95-tls-certs\") pod \"conv-test-round-trip-kserve-c84d7687-b4pvv\" (UID: \"d5613941-d5be-46e8-9a2b-8d88944b4b95\") " 
pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv" Apr 16 18:27:23.351955 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:23.351954 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d5613941-d5be-46e8-9a2b-8d88944b4b95-home\") pod \"conv-test-round-trip-kserve-c84d7687-b4pvv\" (UID: \"d5613941-d5be-46e8-9a2b-8d88944b4b95\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv" Apr 16 18:27:23.352171 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:23.351979 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d5613941-d5be-46e8-9a2b-8d88944b4b95-model-cache\") pod \"conv-test-round-trip-kserve-c84d7687-b4pvv\" (UID: \"d5613941-d5be-46e8-9a2b-8d88944b4b95\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv" Apr 16 18:27:23.352294 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:23.352227 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d5613941-d5be-46e8-9a2b-8d88944b4b95-kserve-provision-location\") pod \"conv-test-round-trip-kserve-c84d7687-b4pvv\" (UID: \"d5613941-d5be-46e8-9a2b-8d88944b4b95\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv" Apr 16 18:27:23.352357 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:23.352297 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d5613941-d5be-46e8-9a2b-8d88944b4b95-home\") pod \"conv-test-round-trip-kserve-c84d7687-b4pvv\" (UID: \"d5613941-d5be-46e8-9a2b-8d88944b4b95\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv" Apr 16 18:27:23.352418 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:23.352389 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/d5613941-d5be-46e8-9a2b-8d88944b4b95-model-cache\") pod \"conv-test-round-trip-kserve-c84d7687-b4pvv\" (UID: \"d5613941-d5be-46e8-9a2b-8d88944b4b95\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv"
Apr 16 18:27:23.354115 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:23.354092 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d5613941-d5be-46e8-9a2b-8d88944b4b95-dshm\") pod \"conv-test-round-trip-kserve-c84d7687-b4pvv\" (UID: \"d5613941-d5be-46e8-9a2b-8d88944b4b95\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv"
Apr 16 18:27:23.354408 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:23.354389 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d5613941-d5be-46e8-9a2b-8d88944b4b95-tls-certs\") pod \"conv-test-round-trip-kserve-c84d7687-b4pvv\" (UID: \"d5613941-d5be-46e8-9a2b-8d88944b4b95\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv"
Apr 16 18:27:23.362006 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:23.361977 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtrnr\" (UniqueName: \"kubernetes.io/projected/d5613941-d5be-46e8-9a2b-8d88944b4b95-kube-api-access-mtrnr\") pod \"conv-test-round-trip-kserve-c84d7687-b4pvv\" (UID: \"d5613941-d5be-46e8-9a2b-8d88944b4b95\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv"
Apr 16 18:27:23.419955 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:23.419916 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv"
Apr 16 18:27:23.568867 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:23.568695 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv"]
Apr 16 18:27:23.571171 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:27:23.571133 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5613941_d5be_46e8_9a2b_8d88944b4b95.slice/crio-ace73e02b27409f95cd105dc60f1dca90a0f15ef3241778c3c2629e5499f516a WatchSource:0}: Error finding container ace73e02b27409f95cd105dc60f1dca90a0f15ef3241778c3c2629e5499f516a: Status 404 returned error can't find the container with id ace73e02b27409f95cd105dc60f1dca90a0f15ef3241778c3c2629e5499f516a
Apr 16 18:27:24.319305 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:24.319268 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv" event={"ID":"d5613941-d5be-46e8-9a2b-8d88944b4b95","Type":"ContainerStarted","Data":"7ac082d0e6e472bcd31d39d72fe6d8f443f04cc741e1411b84d66f0a7ecc4061"}
Apr 16 18:27:24.319305 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:24.319305 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv" event={"ID":"d5613941-d5be-46e8-9a2b-8d88944b4b95","Type":"ContainerStarted","Data":"ace73e02b27409f95cd105dc60f1dca90a0f15ef3241778c3c2629e5499f516a"}
Apr 16 18:27:25.114489 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:25.114438 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg" podUID="8bd89b8d-61eb-4073-9afb-73a4f5678ccf" containerName="main" probeResult="failure" output="Get \"https://10.133.0.28:8000/health\": dial tcp 10.133.0.28:8000: connect: connection refused"
Apr 16 18:27:28.338475 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:28.338435 2574 generic.go:358] "Generic (PLEG): container finished" podID="d5613941-d5be-46e8-9a2b-8d88944b4b95" containerID="7ac082d0e6e472bcd31d39d72fe6d8f443f04cc741e1411b84d66f0a7ecc4061" exitCode=0
Apr 16 18:27:28.338887 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:28.338484 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv" event={"ID":"d5613941-d5be-46e8-9a2b-8d88944b4b95","Type":"ContainerDied","Data":"7ac082d0e6e472bcd31d39d72fe6d8f443f04cc741e1411b84d66f0a7ecc4061"}
Apr 16 18:27:29.344724 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:29.344677 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv" event={"ID":"d5613941-d5be-46e8-9a2b-8d88944b4b95","Type":"ContainerStarted","Data":"5683628d1cc6c79cdbad10f6eab0b67b985cf6de1e4c964ad31e0ed1043688c9"}
Apr 16 18:27:29.372373 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:29.372308 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv" podStartSLOduration=6.372286843 podStartE2EDuration="6.372286843s" podCreationTimestamp="2026-04-16 18:27:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:27:29.368476989 +0000 UTC m=+1069.023711874" watchObservedRunningTime="2026-04-16 18:27:29.372286843 +0000 UTC m=+1069.027521697"
Apr 16 18:27:33.420733 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:33.420692 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv"
Apr 16 18:27:33.420733 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:33.420747 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv"
Apr 16 18:27:33.422417 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:33.422388 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv" podUID="d5613941-d5be-46e8-9a2b-8d88944b4b95" containerName="main" probeResult="failure" output="Get \"https://10.133.0.31:8000/health\": dial tcp 10.133.0.31:8000: connect: connection refused"
Apr 16 18:27:35.124112 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:35.124078 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg"
Apr 16 18:27:35.131940 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:35.131908 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg"
Apr 16 18:27:38.298518 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:38.298484 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd"]
Apr 16 18:27:38.332837 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:38.332793 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd"]
Apr 16 18:27:38.332994 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:38.332937 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd"
Apr 16 18:27:38.335752 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:38.335729 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\""
Apr 16 18:27:38.383761 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:38.383718 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8771fa29-9e98-4225-bc04-bd5621501a57-dshm\") pod \"stop-feature-test-kserve-7fb68447c8-688vd\" (UID: \"8771fa29-9e98-4225-bc04-bd5621501a57\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd"
Apr 16 18:27:38.383969 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:38.383771 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8771fa29-9e98-4225-bc04-bd5621501a57-tls-certs\") pod \"stop-feature-test-kserve-7fb68447c8-688vd\" (UID: \"8771fa29-9e98-4225-bc04-bd5621501a57\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd"
Apr 16 18:27:38.383969 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:38.383805 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lncb\" (UniqueName: \"kubernetes.io/projected/8771fa29-9e98-4225-bc04-bd5621501a57-kube-api-access-2lncb\") pod \"stop-feature-test-kserve-7fb68447c8-688vd\" (UID: \"8771fa29-9e98-4225-bc04-bd5621501a57\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd"
Apr 16 18:27:38.383969 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:38.383863 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8771fa29-9e98-4225-bc04-bd5621501a57-model-cache\") pod \"stop-feature-test-kserve-7fb68447c8-688vd\" (UID: \"8771fa29-9e98-4225-bc04-bd5621501a57\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd"
Apr 16 18:27:38.383969 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:38.383891 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8771fa29-9e98-4225-bc04-bd5621501a57-kserve-provision-location\") pod \"stop-feature-test-kserve-7fb68447c8-688vd\" (UID: \"8771fa29-9e98-4225-bc04-bd5621501a57\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd"
Apr 16 18:27:38.383969 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:38.383950 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8771fa29-9e98-4225-bc04-bd5621501a57-home\") pod \"stop-feature-test-kserve-7fb68447c8-688vd\" (UID: \"8771fa29-9e98-4225-bc04-bd5621501a57\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd"
Apr 16 18:27:38.484579 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:38.484531 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8771fa29-9e98-4225-bc04-bd5621501a57-home\") pod \"stop-feature-test-kserve-7fb68447c8-688vd\" (UID: \"8771fa29-9e98-4225-bc04-bd5621501a57\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd"
Apr 16 18:27:38.484752 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:38.484612 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8771fa29-9e98-4225-bc04-bd5621501a57-dshm\") pod \"stop-feature-test-kserve-7fb68447c8-688vd\" (UID: \"8771fa29-9e98-4225-bc04-bd5621501a57\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd"
Apr 16 18:27:38.484752 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:38.484650 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8771fa29-9e98-4225-bc04-bd5621501a57-tls-certs\") pod \"stop-feature-test-kserve-7fb68447c8-688vd\" (UID: \"8771fa29-9e98-4225-bc04-bd5621501a57\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd"
Apr 16 18:27:38.484850 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:38.484798 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2lncb\" (UniqueName: \"kubernetes.io/projected/8771fa29-9e98-4225-bc04-bd5621501a57-kube-api-access-2lncb\") pod \"stop-feature-test-kserve-7fb68447c8-688vd\" (UID: \"8771fa29-9e98-4225-bc04-bd5621501a57\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd"
Apr 16 18:27:38.484911 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:38.484898 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8771fa29-9e98-4225-bc04-bd5621501a57-model-cache\") pod \"stop-feature-test-kserve-7fb68447c8-688vd\" (UID: \"8771fa29-9e98-4225-bc04-bd5621501a57\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd"
Apr 16 18:27:38.484962 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:38.484936 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8771fa29-9e98-4225-bc04-bd5621501a57-kserve-provision-location\") pod \"stop-feature-test-kserve-7fb68447c8-688vd\" (UID: \"8771fa29-9e98-4225-bc04-bd5621501a57\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd"
Apr 16 18:27:38.485060 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:38.485035 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8771fa29-9e98-4225-bc04-bd5621501a57-home\") pod \"stop-feature-test-kserve-7fb68447c8-688vd\" (UID: \"8771fa29-9e98-4225-bc04-bd5621501a57\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd"
Apr 16 18:27:38.485287 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:38.485266 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8771fa29-9e98-4225-bc04-bd5621501a57-model-cache\") pod \"stop-feature-test-kserve-7fb68447c8-688vd\" (UID: \"8771fa29-9e98-4225-bc04-bd5621501a57\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd"
Apr 16 18:27:38.485359 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:38.485320 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8771fa29-9e98-4225-bc04-bd5621501a57-kserve-provision-location\") pod \"stop-feature-test-kserve-7fb68447c8-688vd\" (UID: \"8771fa29-9e98-4225-bc04-bd5621501a57\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd"
Apr 16 18:27:38.486977 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:38.486951 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8771fa29-9e98-4225-bc04-bd5621501a57-dshm\") pod \"stop-feature-test-kserve-7fb68447c8-688vd\" (UID: \"8771fa29-9e98-4225-bc04-bd5621501a57\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd"
Apr 16 18:27:38.487508 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:38.487492 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8771fa29-9e98-4225-bc04-bd5621501a57-tls-certs\") pod \"stop-feature-test-kserve-7fb68447c8-688vd\" (UID: \"8771fa29-9e98-4225-bc04-bd5621501a57\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd"
Apr 16 18:27:38.495684 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:38.495659 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lncb\" (UniqueName: \"kubernetes.io/projected/8771fa29-9e98-4225-bc04-bd5621501a57-kube-api-access-2lncb\") pod \"stop-feature-test-kserve-7fb68447c8-688vd\" (UID: \"8771fa29-9e98-4225-bc04-bd5621501a57\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd"
Apr 16 18:27:38.646446 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:38.646405 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd"
Apr 16 18:27:38.789022 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:38.788994 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd"]
Apr 16 18:27:38.790528 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:27:38.790499 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8771fa29_9e98_4225_bc04_bd5621501a57.slice/crio-e16dc805729d1257d64f19e4c13c20d1c498b3970d8f17ca0013aebcba3389fb WatchSource:0}: Error finding container e16dc805729d1257d64f19e4c13c20d1c498b3970d8f17ca0013aebcba3389fb: Status 404 returned error can't find the container with id e16dc805729d1257d64f19e4c13c20d1c498b3970d8f17ca0013aebcba3389fb
Apr 16 18:27:39.381316 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:39.381211 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd" event={"ID":"8771fa29-9e98-4225-bc04-bd5621501a57","Type":"ContainerStarted","Data":"52ed64967e895443efa80e30445716ab3e434f2b8e6d8093d29b5e2ecc48178d"}
Apr 16 18:27:39.381316 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:39.381261 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd" event={"ID":"8771fa29-9e98-4225-bc04-bd5621501a57","Type":"ContainerStarted","Data":"e16dc805729d1257d64f19e4c13c20d1c498b3970d8f17ca0013aebcba3389fb"}
Apr 16 18:27:41.576850 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:41.576786 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv"]
Apr 16 18:27:41.577356 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:41.577179 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv" podUID="d5613941-d5be-46e8-9a2b-8d88944b4b95" containerName="main" containerID="cri-o://5683628d1cc6c79cdbad10f6eab0b67b985cf6de1e4c964ad31e0ed1043688c9" gracePeriod=30
Apr 16 18:27:44.401703 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:44.401661 2574 generic.go:358] "Generic (PLEG): container finished" podID="8771fa29-9e98-4225-bc04-bd5621501a57" containerID="52ed64967e895443efa80e30445716ab3e434f2b8e6d8093d29b5e2ecc48178d" exitCode=0
Apr 16 18:27:44.402107 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:44.401715 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd" event={"ID":"8771fa29-9e98-4225-bc04-bd5621501a57","Type":"ContainerDied","Data":"52ed64967e895443efa80e30445716ab3e434f2b8e6d8093d29b5e2ecc48178d"}
Apr 16 18:27:45.406930 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:45.406895 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd" event={"ID":"8771fa29-9e98-4225-bc04-bd5621501a57","Type":"ContainerStarted","Data":"8ef441f6d683125da2977d61ddfbcd06da67e6a8517acbe2d66c9aecf95f00e0"}
Apr 16 18:27:45.431107 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:45.430489 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd" podStartSLOduration=7.430468179 podStartE2EDuration="7.430468179s" podCreationTimestamp="2026-04-16 18:27:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:27:45.42740634 +0000 UTC m=+1085.082641213" watchObservedRunningTime="2026-04-16 18:27:45.430468179 +0000 UTC m=+1085.085703037"
Apr 16 18:27:48.646961 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:48.646917 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd"
Apr 16 18:27:48.646961 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:48.646971 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd"
Apr 16 18:27:48.648445 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:48.648410 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd" podUID="8771fa29-9e98-4225-bc04-bd5621501a57" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused"
Apr 16 18:27:56.098763 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:56.097999 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg"]
Apr 16 18:27:56.098763 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:56.098394 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg" podUID="8bd89b8d-61eb-4073-9afb-73a4f5678ccf" containerName="main" containerID="cri-o://c456595ddfc13d1651d915df16b90faf4a2a2e98e3127d2b3ed5b4d5bea202f7" gracePeriod=30
Apr 16 18:27:58.647952 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:27:58.647897 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd" podUID="8771fa29-9e98-4225-bc04-bd5621501a57" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused"
Apr 16 18:28:08.102979 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:08.102915 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964"]
Apr 16 18:28:08.106706 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:08.106684 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964"
Apr 16 18:28:08.109560 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:08.109535 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\""
Apr 16 18:28:08.119652 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:08.119624 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964"]
Apr 16 18:28:08.271211 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:08.271174 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd-tls-certs\") pod \"custom-route-timeout-test-kserve-5fc8467cf8-pf964\" (UID: \"6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964"
Apr 16 18:28:08.271402 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:08.271220 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-5fc8467cf8-pf964\" (UID: \"6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964"
Apr 16 18:28:08.271402 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:08.271250 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd-dshm\") pod \"custom-route-timeout-test-kserve-5fc8467cf8-pf964\" (UID: \"6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964"
Apr 16 18:28:08.271402 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:08.271338 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd-model-cache\") pod \"custom-route-timeout-test-kserve-5fc8467cf8-pf964\" (UID: \"6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964"
Apr 16 18:28:08.271528 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:08.271404 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd-home\") pod \"custom-route-timeout-test-kserve-5fc8467cf8-pf964\" (UID: \"6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964"
Apr 16 18:28:08.271528 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:08.271458 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bj79\" (UniqueName: \"kubernetes.io/projected/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd-kube-api-access-6bj79\") pod \"custom-route-timeout-test-kserve-5fc8467cf8-pf964\" (UID: \"6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964"
Apr 16 18:28:08.373054 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:08.372238 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd-model-cache\") pod \"custom-route-timeout-test-kserve-5fc8467cf8-pf964\" (UID: \"6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964"
Apr 16 18:28:08.373054 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:08.372310 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd-home\") pod \"custom-route-timeout-test-kserve-5fc8467cf8-pf964\" (UID: \"6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964"
Apr 16 18:28:08.373054 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:08.372380 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6bj79\" (UniqueName: \"kubernetes.io/projected/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd-kube-api-access-6bj79\") pod \"custom-route-timeout-test-kserve-5fc8467cf8-pf964\" (UID: \"6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964"
Apr 16 18:28:08.373054 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:08.372442 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd-tls-certs\") pod \"custom-route-timeout-test-kserve-5fc8467cf8-pf964\" (UID: \"6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964"
Apr 16 18:28:08.373054 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:08.372472 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-5fc8467cf8-pf964\" (UID: \"6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964"
Apr 16 18:28:08.373054 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:08.372503 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd-dshm\") pod \"custom-route-timeout-test-kserve-5fc8467cf8-pf964\" (UID: \"6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964"
Apr 16 18:28:08.373523 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:08.373074 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd-home\") pod \"custom-route-timeout-test-kserve-5fc8467cf8-pf964\" (UID: \"6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964"
Apr 16 18:28:08.373523 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:08.373100 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd-model-cache\") pod \"custom-route-timeout-test-kserve-5fc8467cf8-pf964\" (UID: \"6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964"
Apr 16 18:28:08.373738 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:08.373712 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-5fc8467cf8-pf964\" (UID: \"6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964"
Apr 16 18:28:08.375729 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:08.375701 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd-dshm\") pod \"custom-route-timeout-test-kserve-5fc8467cf8-pf964\" (UID: \"6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964"
Apr 16 18:28:08.375978 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:08.375954 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd-tls-certs\") pod \"custom-route-timeout-test-kserve-5fc8467cf8-pf964\" (UID: \"6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964"
Apr 16 18:28:08.382356 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:08.382329 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bj79\" (UniqueName: \"kubernetes.io/projected/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd-kube-api-access-6bj79\") pod \"custom-route-timeout-test-kserve-5fc8467cf8-pf964\" (UID: \"6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964"
Apr 16 18:28:08.422263 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:08.422219 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964"
Apr 16 18:28:08.571760 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:08.571697 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964"]
Apr 16 18:28:08.575231 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:28:08.575195 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d4b19f7_c2c6_43f0_a27f_8b1bef8edabd.slice/crio-d93c223156d50287f7637dbc1794afea922a3329442638e184c2e9055226deef WatchSource:0}: Error finding container d93c223156d50287f7637dbc1794afea922a3329442638e184c2e9055226deef: Status 404 returned error can't find the container with id d93c223156d50287f7637dbc1794afea922a3329442638e184c2e9055226deef
Apr 16 18:28:08.646972 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:08.646928 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd" podUID="8771fa29-9e98-4225-bc04-bd5621501a57" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused"
Apr 16 18:28:09.498151 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:09.498111 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964" event={"ID":"6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd","Type":"ContainerStarted","Data":"5d6ea4601f2c87e22e6794c863dc190f076286cf28225309574edcbc3f9dff87"}
Apr 16 18:28:09.498151 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:09.498158 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964" event={"ID":"6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd","Type":"ContainerStarted","Data":"d93c223156d50287f7637dbc1794afea922a3329442638e184c2e9055226deef"}
Apr 16 18:28:11.854167 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:11.854132 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-round-trip-kserve-c84d7687-b4pvv_d5613941-d5be-46e8-9a2b-8d88944b4b95/main/0.log"
Apr 16 18:28:11.854672 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:11.854654 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv"
Apr 16 18:28:12.008957 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:12.008907 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d5613941-d5be-46e8-9a2b-8d88944b4b95-home\") pod \"d5613941-d5be-46e8-9a2b-8d88944b4b95\" (UID: \"d5613941-d5be-46e8-9a2b-8d88944b4b95\") "
Apr 16 18:28:12.009156 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:12.009034 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtrnr\" (UniqueName: \"kubernetes.io/projected/d5613941-d5be-46e8-9a2b-8d88944b4b95-kube-api-access-mtrnr\") pod \"d5613941-d5be-46e8-9a2b-8d88944b4b95\" (UID: \"d5613941-d5be-46e8-9a2b-8d88944b4b95\") "
Apr 16 18:28:12.009156 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:12.009068 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d5613941-d5be-46e8-9a2b-8d88944b4b95-model-cache\") pod \"d5613941-d5be-46e8-9a2b-8d88944b4b95\" (UID: \"d5613941-d5be-46e8-9a2b-8d88944b4b95\") "
Apr 16 18:28:12.009156 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:12.009108 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d5613941-d5be-46e8-9a2b-8d88944b4b95-tls-certs\") pod \"d5613941-d5be-46e8-9a2b-8d88944b4b95\" (UID: \"d5613941-d5be-46e8-9a2b-8d88944b4b95\") "
Apr 16 18:28:12.009156 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:12.009140 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d5613941-d5be-46e8-9a2b-8d88944b4b95-dshm\") pod \"d5613941-d5be-46e8-9a2b-8d88944b4b95\" (UID: \"d5613941-d5be-46e8-9a2b-8d88944b4b95\") "
Apr 16 18:28:12.009648 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:12.009169 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d5613941-d5be-46e8-9a2b-8d88944b4b95-kserve-provision-location\") pod \"d5613941-d5be-46e8-9a2b-8d88944b4b95\" (UID: \"d5613941-d5be-46e8-9a2b-8d88944b4b95\") "
Apr 16 18:28:12.009648 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:12.009326 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5613941-d5be-46e8-9a2b-8d88944b4b95-model-cache" (OuterVolumeSpecName: "model-cache") pod "d5613941-d5be-46e8-9a2b-8d88944b4b95" (UID: "d5613941-d5be-46e8-9a2b-8d88944b4b95"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:28:12.009648 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:12.009418 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5613941-d5be-46e8-9a2b-8d88944b4b95-home" (OuterVolumeSpecName: "home") pod "d5613941-d5be-46e8-9a2b-8d88944b4b95" (UID: "d5613941-d5be-46e8-9a2b-8d88944b4b95"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:28:12.009811 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:12.009647 2574 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d5613941-d5be-46e8-9a2b-8d88944b4b95-home\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\""
Apr 16 18:28:12.009811 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:12.009670 2574 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d5613941-d5be-46e8-9a2b-8d88944b4b95-model-cache\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\""
Apr 16 18:28:12.011772 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:12.011741 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5613941-d5be-46e8-9a2b-8d88944b4b95-kube-api-access-mtrnr" (OuterVolumeSpecName: "kube-api-access-mtrnr") pod "d5613941-d5be-46e8-9a2b-8d88944b4b95" (UID: "d5613941-d5be-46e8-9a2b-8d88944b4b95"). InnerVolumeSpecName "kube-api-access-mtrnr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:28:12.012406 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:12.012303 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5613941-d5be-46e8-9a2b-8d88944b4b95-dshm" (OuterVolumeSpecName: "dshm") pod "d5613941-d5be-46e8-9a2b-8d88944b4b95" (UID: "d5613941-d5be-46e8-9a2b-8d88944b4b95"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:28:12.012587 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:12.012452 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5613941-d5be-46e8-9a2b-8d88944b4b95-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d5613941-d5be-46e8-9a2b-8d88944b4b95" (UID: "d5613941-d5be-46e8-9a2b-8d88944b4b95"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:28:12.076501 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:12.076424 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5613941-d5be-46e8-9a2b-8d88944b4b95-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d5613941-d5be-46e8-9a2b-8d88944b4b95" (UID: "d5613941-d5be-46e8-9a2b-8d88944b4b95"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:28:12.110528 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:12.110416 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mtrnr\" (UniqueName: \"kubernetes.io/projected/d5613941-d5be-46e8-9a2b-8d88944b4b95-kube-api-access-mtrnr\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\""
Apr 16 18:28:12.110528 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:12.110462 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d5613941-d5be-46e8-9a2b-8d88944b4b95-tls-certs\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\""
Apr 16 18:28:12.110528 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:12.110480 2574 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d5613941-d5be-46e8-9a2b-8d88944b4b95-dshm\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\""
Apr 16 18:28:12.110528 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:12.110493 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d5613941-d5be-46e8-9a2b-8d88944b4b95-kserve-provision-location\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\""
Apr 16 18:28:12.509794 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:12.509761 2574 log.go:25] "Finished parsing log file"
path="/var/log/pods/kserve-ci-e2e-test_conv-test-round-trip-kserve-c84d7687-b4pvv_d5613941-d5be-46e8-9a2b-8d88944b4b95/main/0.log" Apr 16 18:28:12.510161 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:12.510133 2574 generic.go:358] "Generic (PLEG): container finished" podID="d5613941-d5be-46e8-9a2b-8d88944b4b95" containerID="5683628d1cc6c79cdbad10f6eab0b67b985cf6de1e4c964ad31e0ed1043688c9" exitCode=137 Apr 16 18:28:12.510273 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:12.510189 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv" event={"ID":"d5613941-d5be-46e8-9a2b-8d88944b4b95","Type":"ContainerDied","Data":"5683628d1cc6c79cdbad10f6eab0b67b985cf6de1e4c964ad31e0ed1043688c9"} Apr 16 18:28:12.510273 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:12.510202 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv" Apr 16 18:28:12.510273 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:12.510224 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv" event={"ID":"d5613941-d5be-46e8-9a2b-8d88944b4b95","Type":"ContainerDied","Data":"ace73e02b27409f95cd105dc60f1dca90a0f15ef3241778c3c2629e5499f516a"} Apr 16 18:28:12.510273 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:12.510246 2574 scope.go:117] "RemoveContainer" containerID="5683628d1cc6c79cdbad10f6eab0b67b985cf6de1e4c964ad31e0ed1043688c9" Apr 16 18:28:12.521583 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:12.521554 2574 scope.go:117] "RemoveContainer" containerID="7ac082d0e6e472bcd31d39d72fe6d8f443f04cc741e1411b84d66f0a7ecc4061" Apr 16 18:28:12.535170 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:12.535134 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv"] Apr 16 18:28:12.536704 ip-10-0-142-43 
kubenswrapper[2574]: I0416 18:28:12.536677 2574 scope.go:117] "RemoveContainer" containerID="5683628d1cc6c79cdbad10f6eab0b67b985cf6de1e4c964ad31e0ed1043688c9" Apr 16 18:28:12.537120 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:28:12.537085 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5683628d1cc6c79cdbad10f6eab0b67b985cf6de1e4c964ad31e0ed1043688c9\": container with ID starting with 5683628d1cc6c79cdbad10f6eab0b67b985cf6de1e4c964ad31e0ed1043688c9 not found: ID does not exist" containerID="5683628d1cc6c79cdbad10f6eab0b67b985cf6de1e4c964ad31e0ed1043688c9" Apr 16 18:28:12.537249 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:12.537128 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5683628d1cc6c79cdbad10f6eab0b67b985cf6de1e4c964ad31e0ed1043688c9"} err="failed to get container status \"5683628d1cc6c79cdbad10f6eab0b67b985cf6de1e4c964ad31e0ed1043688c9\": rpc error: code = NotFound desc = could not find container \"5683628d1cc6c79cdbad10f6eab0b67b985cf6de1e4c964ad31e0ed1043688c9\": container with ID starting with 5683628d1cc6c79cdbad10f6eab0b67b985cf6de1e4c964ad31e0ed1043688c9 not found: ID does not exist" Apr 16 18:28:12.537249 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:12.537154 2574 scope.go:117] "RemoveContainer" containerID="7ac082d0e6e472bcd31d39d72fe6d8f443f04cc741e1411b84d66f0a7ecc4061" Apr 16 18:28:12.538024 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:28:12.537982 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ac082d0e6e472bcd31d39d72fe6d8f443f04cc741e1411b84d66f0a7ecc4061\": container with ID starting with 7ac082d0e6e472bcd31d39d72fe6d8f443f04cc741e1411b84d66f0a7ecc4061 not found: ID does not exist" containerID="7ac082d0e6e472bcd31d39d72fe6d8f443f04cc741e1411b84d66f0a7ecc4061" Apr 16 18:28:12.538175 ip-10-0-142-43 kubenswrapper[2574]: I0416 
18:28:12.538151 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ac082d0e6e472bcd31d39d72fe6d8f443f04cc741e1411b84d66f0a7ecc4061"} err="failed to get container status \"7ac082d0e6e472bcd31d39d72fe6d8f443f04cc741e1411b84d66f0a7ecc4061\": rpc error: code = NotFound desc = could not find container \"7ac082d0e6e472bcd31d39d72fe6d8f443f04cc741e1411b84d66f0a7ecc4061\": container with ID starting with 7ac082d0e6e472bcd31d39d72fe6d8f443f04cc741e1411b84d66f0a7ecc4061 not found: ID does not exist" Apr 16 18:28:12.540175 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:12.540149 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-c84d7687-b4pvv"] Apr 16 18:28:12.938032 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:12.938000 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5613941-d5be-46e8-9a2b-8d88944b4b95" path="/var/lib/kubelet/pods/d5613941-d5be-46e8-9a2b-8d88944b4b95/volumes" Apr 16 18:28:13.514563 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:13.514530 2574 generic.go:358] "Generic (PLEG): container finished" podID="6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd" containerID="5d6ea4601f2c87e22e6794c863dc190f076286cf28225309574edcbc3f9dff87" exitCode=0 Apr 16 18:28:13.514859 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:13.514601 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964" event={"ID":"6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd","Type":"ContainerDied","Data":"5d6ea4601f2c87e22e6794c863dc190f076286cf28225309574edcbc3f9dff87"} Apr 16 18:28:14.520582 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:14.520551 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964" 
event={"ID":"6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd","Type":"ContainerStarted","Data":"c604059ff661182517bbccb56c250a42aba5828c36ff7629de594817e4462c07"} Apr 16 18:28:14.544449 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:14.544386 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964" podStartSLOduration=6.544367922 podStartE2EDuration="6.544367922s" podCreationTimestamp="2026-04-16 18:28:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:28:14.542777045 +0000 UTC m=+1114.198011898" watchObservedRunningTime="2026-04-16 18:28:14.544367922 +0000 UTC m=+1114.199602775" Apr 16 18:28:18.422995 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:18.422951 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964" Apr 16 18:28:18.423454 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:18.423015 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964" Apr 16 18:28:18.424706 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:18.424666 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964" podUID="6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8000/health\": dial tcp 10.133.0.33:8000: connect: connection refused" Apr 16 18:28:18.647703 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:18.647654 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd" podUID="8771fa29-9e98-4225-bc04-bd5621501a57" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 
10.133.0.32:8000: connect: connection refused" Apr 16 18:28:26.405175 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:26.405143 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg_8bd89b8d-61eb-4073-9afb-73a4f5678ccf/main/0.log" Apr 16 18:28:26.405609 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:26.405531 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg" Apr 16 18:28:26.438247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:26.438211 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8bd89b8d-61eb-4073-9afb-73a4f5678ccf-kserve-provision-location\") pod \"8bd89b8d-61eb-4073-9afb-73a4f5678ccf\" (UID: \"8bd89b8d-61eb-4073-9afb-73a4f5678ccf\") " Apr 16 18:28:26.438247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:26.438265 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8bd89b8d-61eb-4073-9afb-73a4f5678ccf-model-cache\") pod \"8bd89b8d-61eb-4073-9afb-73a4f5678ccf\" (UID: \"8bd89b8d-61eb-4073-9afb-73a4f5678ccf\") " Apr 16 18:28:26.438588 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:26.438303 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8bd89b8d-61eb-4073-9afb-73a4f5678ccf-tls-certs\") pod \"8bd89b8d-61eb-4073-9afb-73a4f5678ccf\" (UID: \"8bd89b8d-61eb-4073-9afb-73a4f5678ccf\") " Apr 16 18:28:26.438588 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:26.438332 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x88fz\" (UniqueName: \"kubernetes.io/projected/8bd89b8d-61eb-4073-9afb-73a4f5678ccf-kube-api-access-x88fz\") pod 
\"8bd89b8d-61eb-4073-9afb-73a4f5678ccf\" (UID: \"8bd89b8d-61eb-4073-9afb-73a4f5678ccf\") " Apr 16 18:28:26.438588 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:26.438369 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8bd89b8d-61eb-4073-9afb-73a4f5678ccf-dshm\") pod \"8bd89b8d-61eb-4073-9afb-73a4f5678ccf\" (UID: \"8bd89b8d-61eb-4073-9afb-73a4f5678ccf\") " Apr 16 18:28:26.438588 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:26.438414 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8bd89b8d-61eb-4073-9afb-73a4f5678ccf-home\") pod \"8bd89b8d-61eb-4073-9afb-73a4f5678ccf\" (UID: \"8bd89b8d-61eb-4073-9afb-73a4f5678ccf\") " Apr 16 18:28:26.438588 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:26.438556 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bd89b8d-61eb-4073-9afb-73a4f5678ccf-model-cache" (OuterVolumeSpecName: "model-cache") pod "8bd89b8d-61eb-4073-9afb-73a4f5678ccf" (UID: "8bd89b8d-61eb-4073-9afb-73a4f5678ccf"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:28:26.438871 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:26.438699 2574 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8bd89b8d-61eb-4073-9afb-73a4f5678ccf-model-cache\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:28:26.438936 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:26.438885 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bd89b8d-61eb-4073-9afb-73a4f5678ccf-home" (OuterVolumeSpecName: "home") pod "8bd89b8d-61eb-4073-9afb-73a4f5678ccf" (UID: "8bd89b8d-61eb-4073-9afb-73a4f5678ccf"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:28:26.441212 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:26.441180 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bd89b8d-61eb-4073-9afb-73a4f5678ccf-kube-api-access-x88fz" (OuterVolumeSpecName: "kube-api-access-x88fz") pod "8bd89b8d-61eb-4073-9afb-73a4f5678ccf" (UID: "8bd89b8d-61eb-4073-9afb-73a4f5678ccf"). InnerVolumeSpecName "kube-api-access-x88fz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:28:26.441707 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:26.441672 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bd89b8d-61eb-4073-9afb-73a4f5678ccf-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "8bd89b8d-61eb-4073-9afb-73a4f5678ccf" (UID: "8bd89b8d-61eb-4073-9afb-73a4f5678ccf"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:28:26.441865 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:26.441784 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bd89b8d-61eb-4073-9afb-73a4f5678ccf-dshm" (OuterVolumeSpecName: "dshm") pod "8bd89b8d-61eb-4073-9afb-73a4f5678ccf" (UID: "8bd89b8d-61eb-4073-9afb-73a4f5678ccf"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:28:26.497782 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:26.497725 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bd89b8d-61eb-4073-9afb-73a4f5678ccf-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8bd89b8d-61eb-4073-9afb-73a4f5678ccf" (UID: "8bd89b8d-61eb-4073-9afb-73a4f5678ccf"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:28:26.539662 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:26.539624 2574 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8bd89b8d-61eb-4073-9afb-73a4f5678ccf-dshm\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:28:26.539662 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:26.539665 2574 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8bd89b8d-61eb-4073-9afb-73a4f5678ccf-home\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:28:26.539930 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:26.539681 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8bd89b8d-61eb-4073-9afb-73a4f5678ccf-kserve-provision-location\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:28:26.539930 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:26.539695 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8bd89b8d-61eb-4073-9afb-73a4f5678ccf-tls-certs\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:28:26.539930 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:26.539712 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x88fz\" (UniqueName: \"kubernetes.io/projected/8bd89b8d-61eb-4073-9afb-73a4f5678ccf-kube-api-access-x88fz\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:28:26.563737 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:26.563710 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg_8bd89b8d-61eb-4073-9afb-73a4f5678ccf/main/0.log" Apr 16 18:28:26.564141 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:26.564112 2574 generic.go:358] "Generic (PLEG): container 
finished" podID="8bd89b8d-61eb-4073-9afb-73a4f5678ccf" containerID="c456595ddfc13d1651d915df16b90faf4a2a2e98e3127d2b3ed5b4d5bea202f7" exitCode=137 Apr 16 18:28:26.564216 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:26.564181 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg" Apr 16 18:28:26.564277 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:26.564199 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg" event={"ID":"8bd89b8d-61eb-4073-9afb-73a4f5678ccf","Type":"ContainerDied","Data":"c456595ddfc13d1651d915df16b90faf4a2a2e98e3127d2b3ed5b4d5bea202f7"} Apr 16 18:28:26.564277 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:26.564253 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg" event={"ID":"8bd89b8d-61eb-4073-9afb-73a4f5678ccf","Type":"ContainerDied","Data":"30e423ce3c12057665e562510515b6113c66fef8467d41382f61a71a25fac8c1"} Apr 16 18:28:26.564385 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:26.564275 2574 scope.go:117] "RemoveContainer" containerID="c456595ddfc13d1651d915df16b90faf4a2a2e98e3127d2b3ed5b4d5bea202f7" Apr 16 18:28:26.585512 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:26.585489 2574 scope.go:117] "RemoveContainer" containerID="44bb16ecf7f3ae635032af13a766000a782a820bdfb3944e38f1d669ce401dd5" Apr 16 18:28:26.589545 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:26.589512 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg"] Apr 16 18:28:26.593482 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:26.593441 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-658c45c6b45thmg"] Apr 16 
18:28:26.661421 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:26.661395 2574 scope.go:117] "RemoveContainer" containerID="c456595ddfc13d1651d915df16b90faf4a2a2e98e3127d2b3ed5b4d5bea202f7" Apr 16 18:28:26.661836 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:28:26.661796 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c456595ddfc13d1651d915df16b90faf4a2a2e98e3127d2b3ed5b4d5bea202f7\": container with ID starting with c456595ddfc13d1651d915df16b90faf4a2a2e98e3127d2b3ed5b4d5bea202f7 not found: ID does not exist" containerID="c456595ddfc13d1651d915df16b90faf4a2a2e98e3127d2b3ed5b4d5bea202f7" Apr 16 18:28:26.661983 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:26.661957 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c456595ddfc13d1651d915df16b90faf4a2a2e98e3127d2b3ed5b4d5bea202f7"} err="failed to get container status \"c456595ddfc13d1651d915df16b90faf4a2a2e98e3127d2b3ed5b4d5bea202f7\": rpc error: code = NotFound desc = could not find container \"c456595ddfc13d1651d915df16b90faf4a2a2e98e3127d2b3ed5b4d5bea202f7\": container with ID starting with c456595ddfc13d1651d915df16b90faf4a2a2e98e3127d2b3ed5b4d5bea202f7 not found: ID does not exist" Apr 16 18:28:26.662059 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:26.661987 2574 scope.go:117] "RemoveContainer" containerID="44bb16ecf7f3ae635032af13a766000a782a820bdfb3944e38f1d669ce401dd5" Apr 16 18:28:26.662329 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:28:26.662301 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44bb16ecf7f3ae635032af13a766000a782a820bdfb3944e38f1d669ce401dd5\": container with ID starting with 44bb16ecf7f3ae635032af13a766000a782a820bdfb3944e38f1d669ce401dd5 not found: ID does not exist" containerID="44bb16ecf7f3ae635032af13a766000a782a820bdfb3944e38f1d669ce401dd5" Apr 16 18:28:26.662400 
ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:26.662340 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44bb16ecf7f3ae635032af13a766000a782a820bdfb3944e38f1d669ce401dd5"} err="failed to get container status \"44bb16ecf7f3ae635032af13a766000a782a820bdfb3944e38f1d669ce401dd5\": rpc error: code = NotFound desc = could not find container \"44bb16ecf7f3ae635032af13a766000a782a820bdfb3944e38f1d669ce401dd5\": container with ID starting with 44bb16ecf7f3ae635032af13a766000a782a820bdfb3944e38f1d669ce401dd5 not found: ID does not exist" Apr 16 18:28:26.939269 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:26.939180 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bd89b8d-61eb-4073-9afb-73a4f5678ccf" path="/var/lib/kubelet/pods/8bd89b8d-61eb-4073-9afb-73a4f5678ccf/volumes" Apr 16 18:28:28.423429 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:28.423375 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964" podUID="6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8000/health\": dial tcp 10.133.0.33:8000: connect: connection refused" Apr 16 18:28:28.647628 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:28.647579 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd" podUID="8771fa29-9e98-4225-bc04-bd5621501a57" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused" Apr 16 18:28:38.423499 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:38.423442 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964" podUID="6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd" containerName="main" probeResult="failure" output="Get 
\"https://10.133.0.33:8000/health\": dial tcp 10.133.0.33:8000: connect: connection refused" Apr 16 18:28:38.647427 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:38.647366 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd" podUID="8771fa29-9e98-4225-bc04-bd5621501a57" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused" Apr 16 18:28:48.422987 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:48.422940 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964" podUID="6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8000/health\": dial tcp 10.133.0.33:8000: connect: connection refused" Apr 16 18:28:48.647628 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:48.647584 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd" podUID="8771fa29-9e98-4225-bc04-bd5621501a57" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused" Apr 16 18:28:58.423383 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:58.423320 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964" podUID="6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8000/health\": dial tcp 10.133.0.33:8000: connect: connection refused" Apr 16 18:28:58.647835 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:28:58.647772 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd" podUID="8771fa29-9e98-4225-bc04-bd5621501a57" containerName="main" probeResult="failure" 
output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused" Apr 16 18:29:08.422902 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:29:08.422857 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964" podUID="6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8000/health\": dial tcp 10.133.0.33:8000: connect: connection refused" Apr 16 18:29:08.647727 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:29:08.647675 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd" podUID="8771fa29-9e98-4225-bc04-bd5621501a57" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused" Apr 16 18:29:18.422909 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:29:18.422859 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964" podUID="6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8000/health\": dial tcp 10.133.0.33:8000: connect: connection refused" Apr 16 18:29:18.647184 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:29:18.647138 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd" podUID="8771fa29-9e98-4225-bc04-bd5621501a57" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused" Apr 16 18:29:28.423359 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:29:28.423303 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964" podUID="6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd" containerName="main" 
probeResult="failure" output="Get \"https://10.133.0.33:8000/health\": dial tcp 10.133.0.33:8000: connect: connection refused"
Apr 16 18:29:28.647550 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:29:28.647504 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd" podUID="8771fa29-9e98-4225-bc04-bd5621501a57" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused"
Apr 16 18:29:38.423046 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:29:38.423004 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964" podUID="6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8000/health\": dial tcp 10.133.0.33:8000: connect: connection refused"
Apr 16 18:29:38.647541 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:29:38.647500 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd" podUID="8771fa29-9e98-4225-bc04-bd5621501a57" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused"
Apr 16 18:29:40.887947 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:29:40.887911 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4lsvd_1be4b879-19c7-4497-badb-3f90683cdd48/console-operator/1.log"
Apr 16 18:29:40.888786 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:29:40.888766 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4lsvd_1be4b879-19c7-4497-badb-3f90683cdd48/console-operator/1.log"
Apr 16 18:29:40.893270 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:29:40.893246 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wt28v_d40f8597-4c6c-46ba-9f26-4cea171429a6/ovn-acl-logging/0.log"
Apr 16 18:29:40.894698 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:29:40.894675 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wt28v_d40f8597-4c6c-46ba-9f26-4cea171429a6/ovn-acl-logging/0.log"
Apr 16 18:29:48.422856 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:29:48.422786 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964" podUID="6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8000/health\": dial tcp 10.133.0.33:8000: connect: connection refused"
Apr 16 18:29:48.657112 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:29:48.657066 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd"
Apr 16 18:29:48.666024 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:29:48.665992 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd"
Apr 16 18:29:50.472245 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:29:50.472193 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd"]
Apr 16 18:29:50.472884 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:29:50.472500 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd" podUID="8771fa29-9e98-4225-bc04-bd5621501a57" containerName="main" containerID="cri-o://8ef441f6d683125da2977d61ddfbcd06da67e6a8517acbe2d66c9aecf95f00e0" gracePeriod=30
Apr 16 18:29:58.422996 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:29:58.422950 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964" podUID="6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8000/health\": dial tcp 10.133.0.33:8000: connect: connection refused"
Apr 16 18:30:08.422691 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:08.422646 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964" podUID="6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8000/health\": dial tcp 10.133.0.33:8000: connect: connection refused"
Apr 16 18:30:09.408809 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:09.408762 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7"]
Apr 16 18:30:09.409093 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:09.409081 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d5613941-d5be-46e8-9a2b-8d88944b4b95" containerName="main"
Apr 16 18:30:09.409137 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:09.409095 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5613941-d5be-46e8-9a2b-8d88944b4b95" containerName="main"
Apr 16 18:30:09.409137 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:09.409103 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d5613941-d5be-46e8-9a2b-8d88944b4b95" containerName="storage-initializer"
Apr 16 18:30:09.409137 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:09.409108 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5613941-d5be-46e8-9a2b-8d88944b4b95" containerName="storage-initializer"
Apr 16 18:30:09.409137 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:09.409121 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8bd89b8d-61eb-4073-9afb-73a4f5678ccf" containerName="storage-initializer"
Apr 16 18:30:09.409137 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:09.409127 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bd89b8d-61eb-4073-9afb-73a4f5678ccf" containerName="storage-initializer"
Apr 16 18:30:09.409292 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:09.409141 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8bd89b8d-61eb-4073-9afb-73a4f5678ccf" containerName="main"
Apr 16 18:30:09.409292 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:09.409147 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bd89b8d-61eb-4073-9afb-73a4f5678ccf" containerName="main"
Apr 16 18:30:09.409292 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:09.409193 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="d5613941-d5be-46e8-9a2b-8d88944b4b95" containerName="main"
Apr 16 18:30:09.409292 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:09.409201 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="8bd89b8d-61eb-4073-9afb-73a4f5678ccf" containerName="main"
Apr 16 18:30:09.413129 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:09.413110 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7"
Apr 16 18:30:09.414104 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:09.414083 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a5183c81-a13d-4da1-a55e-a1174b697e5c-dshm\") pod \"stop-feature-test-kserve-7fb68447c8-z6cf7\" (UID: \"a5183c81-a13d-4da1-a55e-a1174b697e5c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7"
Apr 16 18:30:09.414163 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:09.414120 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a5183c81-a13d-4da1-a55e-a1174b697e5c-home\") pod \"stop-feature-test-kserve-7fb68447c8-z6cf7\" (UID: \"a5183c81-a13d-4da1-a55e-a1174b697e5c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7"
Apr 16 18:30:09.414163 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:09.414152 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a5183c81-a13d-4da1-a55e-a1174b697e5c-kserve-provision-location\") pod \"stop-feature-test-kserve-7fb68447c8-z6cf7\" (UID: \"a5183c81-a13d-4da1-a55e-a1174b697e5c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7"
Apr 16 18:30:09.414244 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:09.414171 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a5183c81-a13d-4da1-a55e-a1174b697e5c-tls-certs\") pod \"stop-feature-test-kserve-7fb68447c8-z6cf7\" (UID: \"a5183c81-a13d-4da1-a55e-a1174b697e5c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7"
Apr 16 18:30:09.414244 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:09.414204 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xljwf\" (UniqueName: \"kubernetes.io/projected/a5183c81-a13d-4da1-a55e-a1174b697e5c-kube-api-access-xljwf\") pod \"stop-feature-test-kserve-7fb68447c8-z6cf7\" (UID: \"a5183c81-a13d-4da1-a55e-a1174b697e5c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7"
Apr 16 18:30:09.414352 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:09.414311 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a5183c81-a13d-4da1-a55e-a1174b697e5c-model-cache\") pod \"stop-feature-test-kserve-7fb68447c8-z6cf7\" (UID: \"a5183c81-a13d-4da1-a55e-a1174b697e5c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7"
Apr 16 18:30:09.429641 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:09.429607 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7"]
Apr 16 18:30:09.515163 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:09.515122 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a5183c81-a13d-4da1-a55e-a1174b697e5c-dshm\") pod \"stop-feature-test-kserve-7fb68447c8-z6cf7\" (UID: \"a5183c81-a13d-4da1-a55e-a1174b697e5c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7"
Apr 16 18:30:09.515163 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:09.515166 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a5183c81-a13d-4da1-a55e-a1174b697e5c-home\") pod \"stop-feature-test-kserve-7fb68447c8-z6cf7\" (UID: \"a5183c81-a13d-4da1-a55e-a1174b697e5c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7"
Apr 16 18:30:09.515419 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:09.515188 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a5183c81-a13d-4da1-a55e-a1174b697e5c-kserve-provision-location\") pod \"stop-feature-test-kserve-7fb68447c8-z6cf7\" (UID: \"a5183c81-a13d-4da1-a55e-a1174b697e5c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7"
Apr 16 18:30:09.515419 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:09.515208 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a5183c81-a13d-4da1-a55e-a1174b697e5c-tls-certs\") pod \"stop-feature-test-kserve-7fb68447c8-z6cf7\" (UID: \"a5183c81-a13d-4da1-a55e-a1174b697e5c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7"
Apr 16 18:30:09.515419 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:09.515242 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xljwf\" (UniqueName: \"kubernetes.io/projected/a5183c81-a13d-4da1-a55e-a1174b697e5c-kube-api-access-xljwf\") pod \"stop-feature-test-kserve-7fb68447c8-z6cf7\" (UID: \"a5183c81-a13d-4da1-a55e-a1174b697e5c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7"
Apr 16 18:30:09.515419 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:09.515274 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a5183c81-a13d-4da1-a55e-a1174b697e5c-model-cache\") pod \"stop-feature-test-kserve-7fb68447c8-z6cf7\" (UID: \"a5183c81-a13d-4da1-a55e-a1174b697e5c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7"
Apr 16 18:30:09.515651 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:09.515626 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a5183c81-a13d-4da1-a55e-a1174b697e5c-home\") pod \"stop-feature-test-kserve-7fb68447c8-z6cf7\" (UID: \"a5183c81-a13d-4da1-a55e-a1174b697e5c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7"
Apr 16 18:30:09.515714 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:09.515688 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a5183c81-a13d-4da1-a55e-a1174b697e5c-kserve-provision-location\") pod \"stop-feature-test-kserve-7fb68447c8-z6cf7\" (UID: \"a5183c81-a13d-4da1-a55e-a1174b697e5c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7"
Apr 16 18:30:09.515756 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:09.515727 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a5183c81-a13d-4da1-a55e-a1174b697e5c-model-cache\") pod \"stop-feature-test-kserve-7fb68447c8-z6cf7\" (UID: \"a5183c81-a13d-4da1-a55e-a1174b697e5c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7"
Apr 16 18:30:09.517555 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:09.517534 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a5183c81-a13d-4da1-a55e-a1174b697e5c-dshm\") pod \"stop-feature-test-kserve-7fb68447c8-z6cf7\" (UID: \"a5183c81-a13d-4da1-a55e-a1174b697e5c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7"
Apr 16 18:30:09.517735 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:09.517717 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a5183c81-a13d-4da1-a55e-a1174b697e5c-tls-certs\") pod \"stop-feature-test-kserve-7fb68447c8-z6cf7\" (UID: \"a5183c81-a13d-4da1-a55e-a1174b697e5c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7"
Apr 16 18:30:09.524143 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:09.524115 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xljwf\" (UniqueName: \"kubernetes.io/projected/a5183c81-a13d-4da1-a55e-a1174b697e5c-kube-api-access-xljwf\") pod \"stop-feature-test-kserve-7fb68447c8-z6cf7\" (UID: \"a5183c81-a13d-4da1-a55e-a1174b697e5c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7"
Apr 16 18:30:09.724750 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:09.724653 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7"
Apr 16 18:30:09.857063 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:09.857016 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7"]
Apr 16 18:30:09.860206 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:30:09.860170 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5183c81_a13d_4da1_a55e_a1174b697e5c.slice/crio-6372994f1444062f30ac1b6528d6dc01bc95438a4b06af662a7790c09ad904f2 WatchSource:0}: Error finding container 6372994f1444062f30ac1b6528d6dc01bc95438a4b06af662a7790c09ad904f2: Status 404 returned error can't find the container with id 6372994f1444062f30ac1b6528d6dc01bc95438a4b06af662a7790c09ad904f2
Apr 16 18:30:09.861916 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:09.861896 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:30:09.920486 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:09.920451 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7" event={"ID":"a5183c81-a13d-4da1-a55e-a1174b697e5c","Type":"ContainerStarted","Data":"6372994f1444062f30ac1b6528d6dc01bc95438a4b06af662a7790c09ad904f2"}
Apr 16 18:30:10.926783 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:10.926741 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7" event={"ID":"a5183c81-a13d-4da1-a55e-a1174b697e5c","Type":"ContainerStarted","Data":"05a5939561e8a1dc5f316cca3a33cb4a2b30082b5c87e8d03b69c0edf6447752"}
Apr 16 18:30:14.940385 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:14.940349 2574 generic.go:358] "Generic (PLEG): container finished" podID="a5183c81-a13d-4da1-a55e-a1174b697e5c" containerID="05a5939561e8a1dc5f316cca3a33cb4a2b30082b5c87e8d03b69c0edf6447752" exitCode=0
Apr 16 18:30:14.940772 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:14.940425 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7" event={"ID":"a5183c81-a13d-4da1-a55e-a1174b697e5c","Type":"ContainerDied","Data":"05a5939561e8a1dc5f316cca3a33cb4a2b30082b5c87e8d03b69c0edf6447752"}
Apr 16 18:30:15.945200 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:15.945163 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7" event={"ID":"a5183c81-a13d-4da1-a55e-a1174b697e5c","Type":"ContainerStarted","Data":"15be2241f38187f9d56fd9170d8f47472368b848cf52c22e8e48d29152c0c3c6"}
Apr 16 18:30:15.971701 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:15.971649 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7" podStartSLOduration=6.971632635 podStartE2EDuration="6.971632635s" podCreationTimestamp="2026-04-16 18:30:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:30:15.969885176 +0000 UTC m=+1235.625120067" watchObservedRunningTime="2026-04-16 18:30:15.971632635 +0000 UTC m=+1235.626867488"
Apr 16 18:30:18.437596 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:18.437566 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964"
Apr 16 18:30:18.446452 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:18.446417 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964"
Apr 16 18:30:19.725215 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:19.725172 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7"
Apr 16 18:30:19.725215 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:19.725226 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7"
Apr 16 18:30:19.727212 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:19.727179 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7" podUID="a5183c81-a13d-4da1-a55e-a1174b697e5c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused"
Apr 16 18:30:20.731143 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:20.731076 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-7fb68447c8-688vd_8771fa29-9e98-4225-bc04-bd5621501a57/main/0.log"
Apr 16 18:30:20.731499 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:20.731479 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd"
Apr 16 18:30:20.822821 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:20.822780 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lncb\" (UniqueName: \"kubernetes.io/projected/8771fa29-9e98-4225-bc04-bd5621501a57-kube-api-access-2lncb\") pod \"8771fa29-9e98-4225-bc04-bd5621501a57\" (UID: \"8771fa29-9e98-4225-bc04-bd5621501a57\") "
Apr 16 18:30:20.823037 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:20.822882 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8771fa29-9e98-4225-bc04-bd5621501a57-kserve-provision-location\") pod \"8771fa29-9e98-4225-bc04-bd5621501a57\" (UID: \"8771fa29-9e98-4225-bc04-bd5621501a57\") "
Apr 16 18:30:20.823037 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:20.822905 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8771fa29-9e98-4225-bc04-bd5621501a57-model-cache\") pod \"8771fa29-9e98-4225-bc04-bd5621501a57\" (UID: \"8771fa29-9e98-4225-bc04-bd5621501a57\") "
Apr 16 18:30:20.823037 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:20.822948 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8771fa29-9e98-4225-bc04-bd5621501a57-home\") pod \"8771fa29-9e98-4225-bc04-bd5621501a57\" (UID: \"8771fa29-9e98-4225-bc04-bd5621501a57\") "
Apr 16 18:30:20.823037 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:20.823004 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8771fa29-9e98-4225-bc04-bd5621501a57-dshm\") pod \"8771fa29-9e98-4225-bc04-bd5621501a57\" (UID: \"8771fa29-9e98-4225-bc04-bd5621501a57\") "
Apr 16 18:30:20.823253 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:20.823045 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8771fa29-9e98-4225-bc04-bd5621501a57-tls-certs\") pod \"8771fa29-9e98-4225-bc04-bd5621501a57\" (UID: \"8771fa29-9e98-4225-bc04-bd5621501a57\") "
Apr 16 18:30:20.823253 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:20.823202 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8771fa29-9e98-4225-bc04-bd5621501a57-model-cache" (OuterVolumeSpecName: "model-cache") pod "8771fa29-9e98-4225-bc04-bd5621501a57" (UID: "8771fa29-9e98-4225-bc04-bd5621501a57"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:30:20.823353 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:20.823325 2574 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8771fa29-9e98-4225-bc04-bd5621501a57-model-cache\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\""
Apr 16 18:30:20.823399 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:20.823347 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8771fa29-9e98-4225-bc04-bd5621501a57-home" (OuterVolumeSpecName: "home") pod "8771fa29-9e98-4225-bc04-bd5621501a57" (UID: "8771fa29-9e98-4225-bc04-bd5621501a57"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:30:20.825444 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:20.825411 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8771fa29-9e98-4225-bc04-bd5621501a57-dshm" (OuterVolumeSpecName: "dshm") pod "8771fa29-9e98-4225-bc04-bd5621501a57" (UID: "8771fa29-9e98-4225-bc04-bd5621501a57"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:30:20.826012 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:20.825981 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8771fa29-9e98-4225-bc04-bd5621501a57-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "8771fa29-9e98-4225-bc04-bd5621501a57" (UID: "8771fa29-9e98-4225-bc04-bd5621501a57"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:30:20.826243 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:20.826216 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8771fa29-9e98-4225-bc04-bd5621501a57-kube-api-access-2lncb" (OuterVolumeSpecName: "kube-api-access-2lncb") pod "8771fa29-9e98-4225-bc04-bd5621501a57" (UID: "8771fa29-9e98-4225-bc04-bd5621501a57"). InnerVolumeSpecName "kube-api-access-2lncb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:30:20.894438 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:20.894385 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8771fa29-9e98-4225-bc04-bd5621501a57-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8771fa29-9e98-4225-bc04-bd5621501a57" (UID: "8771fa29-9e98-4225-bc04-bd5621501a57"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:30:20.924467 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:20.924426 2574 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8771fa29-9e98-4225-bc04-bd5621501a57-dshm\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\""
Apr 16 18:30:20.924467 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:20.924468 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8771fa29-9e98-4225-bc04-bd5621501a57-tls-certs\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\""
Apr 16 18:30:20.924732 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:20.924485 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2lncb\" (UniqueName: \"kubernetes.io/projected/8771fa29-9e98-4225-bc04-bd5621501a57-kube-api-access-2lncb\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\""
Apr 16 18:30:20.924732 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:20.924500 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8771fa29-9e98-4225-bc04-bd5621501a57-kserve-provision-location\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\""
Apr 16 18:30:20.924732 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:20.924515 2574 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8771fa29-9e98-4225-bc04-bd5621501a57-home\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\""
Apr 16 18:30:20.962882 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:20.962844 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-7fb68447c8-688vd_8771fa29-9e98-4225-bc04-bd5621501a57/main/0.log"
Apr 16 18:30:20.963226 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:20.963198 2574 generic.go:358] "Generic (PLEG): container finished" podID="8771fa29-9e98-4225-bc04-bd5621501a57" containerID="8ef441f6d683125da2977d61ddfbcd06da67e6a8517acbe2d66c9aecf95f00e0" exitCode=137
Apr 16 18:30:20.963343 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:20.963284 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd"
Apr 16 18:30:20.963405 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:20.963286 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd" event={"ID":"8771fa29-9e98-4225-bc04-bd5621501a57","Type":"ContainerDied","Data":"8ef441f6d683125da2977d61ddfbcd06da67e6a8517acbe2d66c9aecf95f00e0"}
Apr 16 18:30:20.963523 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:20.963411 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd" event={"ID":"8771fa29-9e98-4225-bc04-bd5621501a57","Type":"ContainerDied","Data":"e16dc805729d1257d64f19e4c13c20d1c498b3970d8f17ca0013aebcba3389fb"}
Apr 16 18:30:20.963523 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:20.963428 2574 scope.go:117] "RemoveContainer" containerID="8ef441f6d683125da2977d61ddfbcd06da67e6a8517acbe2d66c9aecf95f00e0"
Apr 16 18:30:20.973145 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:20.973113 2574 scope.go:117] "RemoveContainer" containerID="52ed64967e895443efa80e30445716ab3e434f2b8e6d8093d29b5e2ecc48178d"
Apr 16 18:30:20.987025 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:20.986933 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd"]
Apr 16 18:30:20.992904 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:20.992860 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-688vd"]
Apr 16 18:30:21.008922 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:21.008896 2574 scope.go:117] "RemoveContainer" containerID="8ef441f6d683125da2977d61ddfbcd06da67e6a8517acbe2d66c9aecf95f00e0"
Apr 16 18:30:21.009418 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:30:21.009390 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ef441f6d683125da2977d61ddfbcd06da67e6a8517acbe2d66c9aecf95f00e0\": container with ID starting with 8ef441f6d683125da2977d61ddfbcd06da67e6a8517acbe2d66c9aecf95f00e0 not found: ID does not exist" containerID="8ef441f6d683125da2977d61ddfbcd06da67e6a8517acbe2d66c9aecf95f00e0"
Apr 16 18:30:21.009511 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:21.009428 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ef441f6d683125da2977d61ddfbcd06da67e6a8517acbe2d66c9aecf95f00e0"} err="failed to get container status \"8ef441f6d683125da2977d61ddfbcd06da67e6a8517acbe2d66c9aecf95f00e0\": rpc error: code = NotFound desc = could not find container \"8ef441f6d683125da2977d61ddfbcd06da67e6a8517acbe2d66c9aecf95f00e0\": container with ID starting with 8ef441f6d683125da2977d61ddfbcd06da67e6a8517acbe2d66c9aecf95f00e0 not found: ID does not exist"
Apr 16 18:30:21.009511 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:21.009450 2574 scope.go:117] "RemoveContainer" containerID="52ed64967e895443efa80e30445716ab3e434f2b8e6d8093d29b5e2ecc48178d"
Apr 16 18:30:21.009770 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:30:21.009747 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52ed64967e895443efa80e30445716ab3e434f2b8e6d8093d29b5e2ecc48178d\": container with ID starting with 52ed64967e895443efa80e30445716ab3e434f2b8e6d8093d29b5e2ecc48178d not found: ID does not exist" containerID="52ed64967e895443efa80e30445716ab3e434f2b8e6d8093d29b5e2ecc48178d"
Apr 16 18:30:21.009844 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:21.009778 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52ed64967e895443efa80e30445716ab3e434f2b8e6d8093d29b5e2ecc48178d"} err="failed to get container status \"52ed64967e895443efa80e30445716ab3e434f2b8e6d8093d29b5e2ecc48178d\": rpc error: code = NotFound desc = could not find container \"52ed64967e895443efa80e30445716ab3e434f2b8e6d8093d29b5e2ecc48178d\": container with ID starting with 52ed64967e895443efa80e30445716ab3e434f2b8e6d8093d29b5e2ecc48178d not found: ID does not exist"
Apr 16 18:30:22.939700 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:22.939661 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8771fa29-9e98-4225-bc04-bd5621501a57" path="/var/lib/kubelet/pods/8771fa29-9e98-4225-bc04-bd5621501a57/volumes"
Apr 16 18:30:29.725546 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:29.725496 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7" podUID="a5183c81-a13d-4da1-a55e-a1174b697e5c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused"
Apr 16 18:30:36.390385 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:36.390345 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964"]
Apr 16 18:30:36.390810 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:36.390639 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964" podUID="6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd" containerName="main" containerID="cri-o://c604059ff661182517bbccb56c250a42aba5828c36ff7629de594817e4462c07" gracePeriod=30
Apr 16 18:30:39.726077 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:39.726020 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7" podUID="a5183c81-a13d-4da1-a55e-a1174b697e5c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused"
Apr 16 18:30:49.725734 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:49.725683 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7" podUID="a5183c81-a13d-4da1-a55e-a1174b697e5c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused"
Apr 16 18:30:59.725638 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:30:59.725587 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7" podUID="a5183c81-a13d-4da1-a55e-a1174b697e5c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused"
Apr 16 18:31:03.124496 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:03.124456 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl"]
Apr 16 18:31:03.124939 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:03.124920 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8771fa29-9e98-4225-bc04-bd5621501a57" containerName="storage-initializer"
Apr 16 18:31:03.125018 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:03.124943 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="8771fa29-9e98-4225-bc04-bd5621501a57" containerName="storage-initializer"
Apr 16 18:31:03.125018 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:03.124966 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8771fa29-9e98-4225-bc04-bd5621501a57" containerName="main"
Apr 16 18:31:03.125018 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:03.124975 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="8771fa29-9e98-4225-bc04-bd5621501a57" containerName="main"
Apr 16 18:31:03.125164 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:03.125072 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="8771fa29-9e98-4225-bc04-bd5621501a57" containerName="main"
Apr 16 18:31:03.128462 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:03.128432 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl"
Apr 16 18:31:03.131331 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:03.131301 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\""
Apr 16 18:31:03.142849 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:03.142791 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl"]
Apr 16 18:31:03.210418 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:03.210383 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c48175cf-0af6-418c-ba31-396e10369610-kserve-provision-location\") pod \"router-with-refs-test-kserve-b85596bcd-zghwl\" (UID: \"c48175cf-0af6-418c-ba31-396e10369610\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl"
Apr 16 18:31:03.210418 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:03.210421 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcgp8\" (UniqueName: \"kubernetes.io/projected/c48175cf-0af6-418c-ba31-396e10369610-kube-api-access-gcgp8\") pod \"router-with-refs-test-kserve-b85596bcd-zghwl\" (UID: \"c48175cf-0af6-418c-ba31-396e10369610\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl"
Apr 16 18:31:03.210680 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:03.210496 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c48175cf-0af6-418c-ba31-396e10369610-model-cache\") pod \"router-with-refs-test-kserve-b85596bcd-zghwl\" (UID: \"c48175cf-0af6-418c-ba31-396e10369610\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl"
Apr 16 18:31:03.210680 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:03.210532 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c48175cf-0af6-418c-ba31-396e10369610-tls-certs\") pod \"router-with-refs-test-kserve-b85596bcd-zghwl\" (UID: \"c48175cf-0af6-418c-ba31-396e10369610\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl"
Apr 16 18:31:03.210680 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:03.210613 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c48175cf-0af6-418c-ba31-396e10369610-dshm\") pod \"router-with-refs-test-kserve-b85596bcd-zghwl\" (UID: \"c48175cf-0af6-418c-ba31-396e10369610\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl"
Apr 16 18:31:03.210680 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:03.210640 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c48175cf-0af6-418c-ba31-396e10369610-home\") pod \"router-with-refs-test-kserve-b85596bcd-zghwl\" (UID: \"c48175cf-0af6-418c-ba31-396e10369610\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl"
Apr 16 18:31:03.311233 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:03.311191 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c48175cf-0af6-418c-ba31-396e10369610-model-cache\") pod \"router-with-refs-test-kserve-b85596bcd-zghwl\" (UID: \"c48175cf-0af6-418c-ba31-396e10369610\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl"
Apr 16 18:31:03.311233 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:03.311239 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c48175cf-0af6-418c-ba31-396e10369610-tls-certs\") pod \"router-with-refs-test-kserve-b85596bcd-zghwl\" (UID: \"c48175cf-0af6-418c-ba31-396e10369610\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl"
Apr 16 18:31:03.311482 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:03.311286 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c48175cf-0af6-418c-ba31-396e10369610-dshm\") pod \"router-with-refs-test-kserve-b85596bcd-zghwl\" (UID: \"c48175cf-0af6-418c-ba31-396e10369610\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl"
Apr 16 18:31:03.311482 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:03.311316 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c48175cf-0af6-418c-ba31-396e10369610-home\") pod \"router-with-refs-test-kserve-b85596bcd-zghwl\" (UID: \"c48175cf-0af6-418c-ba31-396e10369610\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl"
Apr 16 18:31:03.311482 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:03.311393 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c48175cf-0af6-418c-ba31-396e10369610-kserve-provision-location\") pod \"router-with-refs-test-kserve-b85596bcd-zghwl\" (UID: \"c48175cf-0af6-418c-ba31-396e10369610\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl"
Apr 16 18:31:03.311482 ip-10-0-142-43 kubenswrapper[2574]: I0416
18:31:03.311426 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gcgp8\" (UniqueName: \"kubernetes.io/projected/c48175cf-0af6-418c-ba31-396e10369610-kube-api-access-gcgp8\") pod \"router-with-refs-test-kserve-b85596bcd-zghwl\" (UID: \"c48175cf-0af6-418c-ba31-396e10369610\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl" Apr 16 18:31:03.311692 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:03.311668 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c48175cf-0af6-418c-ba31-396e10369610-model-cache\") pod \"router-with-refs-test-kserve-b85596bcd-zghwl\" (UID: \"c48175cf-0af6-418c-ba31-396e10369610\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl" Apr 16 18:31:03.311842 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:03.311795 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c48175cf-0af6-418c-ba31-396e10369610-home\") pod \"router-with-refs-test-kserve-b85596bcd-zghwl\" (UID: \"c48175cf-0af6-418c-ba31-396e10369610\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl" Apr 16 18:31:03.311950 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:03.311849 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c48175cf-0af6-418c-ba31-396e10369610-kserve-provision-location\") pod \"router-with-refs-test-kserve-b85596bcd-zghwl\" (UID: \"c48175cf-0af6-418c-ba31-396e10369610\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl" Apr 16 18:31:03.313567 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:03.313540 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c48175cf-0af6-418c-ba31-396e10369610-dshm\") pod 
\"router-with-refs-test-kserve-b85596bcd-zghwl\" (UID: \"c48175cf-0af6-418c-ba31-396e10369610\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl" Apr 16 18:31:03.314080 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:03.314057 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c48175cf-0af6-418c-ba31-396e10369610-tls-certs\") pod \"router-with-refs-test-kserve-b85596bcd-zghwl\" (UID: \"c48175cf-0af6-418c-ba31-396e10369610\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl" Apr 16 18:31:03.322228 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:03.322197 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcgp8\" (UniqueName: \"kubernetes.io/projected/c48175cf-0af6-418c-ba31-396e10369610-kube-api-access-gcgp8\") pod \"router-with-refs-test-kserve-b85596bcd-zghwl\" (UID: \"c48175cf-0af6-418c-ba31-396e10369610\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl" Apr 16 18:31:03.439154 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:03.439113 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl" Apr 16 18:31:03.582565 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:03.582532 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl"] Apr 16 18:31:03.584803 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:31:03.584755 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc48175cf_0af6_418c_ba31_396e10369610.slice/crio-3ff86c6d334e988243a13f5d13a65c171006e77a487aba3b85dff16f6cd92e90 WatchSource:0}: Error finding container 3ff86c6d334e988243a13f5d13a65c171006e77a487aba3b85dff16f6cd92e90: Status 404 returned error can't find the container with id 3ff86c6d334e988243a13f5d13a65c171006e77a487aba3b85dff16f6cd92e90 Apr 16 18:31:04.132517 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:04.132471 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl" event={"ID":"c48175cf-0af6-418c-ba31-396e10369610","Type":"ContainerStarted","Data":"e549d6abd4cbf92b6f3a07230d0dff61231f2c8944b9805b9a2e2ec2c5006661"} Apr 16 18:31:04.132517 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:04.132518 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl" event={"ID":"c48175cf-0af6-418c-ba31-396e10369610","Type":"ContainerStarted","Data":"3ff86c6d334e988243a13f5d13a65c171006e77a487aba3b85dff16f6cd92e90"} Apr 16 18:31:06.678234 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:06.678208 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-5fc8467cf8-pf964_6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd/main/0.log" Apr 16 18:31:06.678634 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:06.678617 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964" Apr 16 18:31:06.743620 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:06.743570 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd-tls-certs\") pod \"6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd\" (UID: \"6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd\") " Apr 16 18:31:06.743620 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:06.743608 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd-kserve-provision-location\") pod \"6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd\" (UID: \"6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd\") " Apr 16 18:31:06.743915 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:06.743646 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd-home\") pod \"6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd\" (UID: \"6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd\") " Apr 16 18:31:06.743915 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:06.743668 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd-dshm\") pod \"6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd\" (UID: \"6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd\") " Apr 16 18:31:06.743915 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:06.743693 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd-model-cache\") pod \"6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd\" (UID: \"6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd\") " Apr 16 18:31:06.743915 ip-10-0-142-43 kubenswrapper[2574]: I0416 
18:31:06.743718 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bj79\" (UniqueName: \"kubernetes.io/projected/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd-kube-api-access-6bj79\") pod \"6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd\" (UID: \"6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd\") " Apr 16 18:31:06.744130 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:06.744032 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd-home" (OuterVolumeSpecName: "home") pod "6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd" (UID: "6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:31:06.744216 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:06.744187 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd-model-cache" (OuterVolumeSpecName: "model-cache") pod "6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd" (UID: "6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:31:06.745980 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:06.745946 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd" (UID: "6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:31:06.746518 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:06.746495 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd-kube-api-access-6bj79" (OuterVolumeSpecName: "kube-api-access-6bj79") pod "6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd" (UID: "6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd"). InnerVolumeSpecName "kube-api-access-6bj79". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:31:06.746597 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:06.746497 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd-dshm" (OuterVolumeSpecName: "dshm") pod "6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd" (UID: "6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:31:06.819341 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:06.819288 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd" (UID: "6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:31:06.844730 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:06.844677 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd-tls-certs\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:31:06.844730 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:06.844712 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd-kserve-provision-location\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:31:06.844730 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:06.844721 2574 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd-home\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:31:06.844730 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:06.844730 2574 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd-dshm\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:31:06.844730 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:06.844738 2574 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd-model-cache\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:31:06.844730 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:06.844747 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6bj79\" (UniqueName: \"kubernetes.io/projected/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd-kube-api-access-6bj79\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:31:07.146963 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:07.146867 2574 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-5fc8467cf8-pf964_6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd/main/0.log" Apr 16 18:31:07.147651 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:07.147588 2574 generic.go:358] "Generic (PLEG): container finished" podID="6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd" containerID="c604059ff661182517bbccb56c250a42aba5828c36ff7629de594817e4462c07" exitCode=137 Apr 16 18:31:07.147651 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:07.147663 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964" event={"ID":"6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd","Type":"ContainerDied","Data":"c604059ff661182517bbccb56c250a42aba5828c36ff7629de594817e4462c07"} Apr 16 18:31:07.147651 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:07.147687 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964" Apr 16 18:31:07.148066 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:07.147719 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964" event={"ID":"6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd","Type":"ContainerDied","Data":"d93c223156d50287f7637dbc1794afea922a3329442638e184c2e9055226deef"} Apr 16 18:31:07.148066 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:07.147744 2574 scope.go:117] "RemoveContainer" containerID="c604059ff661182517bbccb56c250a42aba5828c36ff7629de594817e4462c07" Apr 16 18:31:07.176191 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:07.176124 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964"] Apr 16 18:31:07.182305 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:07.182275 2574 scope.go:117] "RemoveContainer" containerID="5d6ea4601f2c87e22e6794c863dc190f076286cf28225309574edcbc3f9dff87" Apr 
16 18:31:07.190018 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:07.189986 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-5fc8467cf8-pf964"] Apr 16 18:31:07.261509 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:07.261480 2574 scope.go:117] "RemoveContainer" containerID="c604059ff661182517bbccb56c250a42aba5828c36ff7629de594817e4462c07" Apr 16 18:31:07.261950 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:31:07.261922 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c604059ff661182517bbccb56c250a42aba5828c36ff7629de594817e4462c07\": container with ID starting with c604059ff661182517bbccb56c250a42aba5828c36ff7629de594817e4462c07 not found: ID does not exist" containerID="c604059ff661182517bbccb56c250a42aba5828c36ff7629de594817e4462c07" Apr 16 18:31:07.262045 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:07.261961 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c604059ff661182517bbccb56c250a42aba5828c36ff7629de594817e4462c07"} err="failed to get container status \"c604059ff661182517bbccb56c250a42aba5828c36ff7629de594817e4462c07\": rpc error: code = NotFound desc = could not find container \"c604059ff661182517bbccb56c250a42aba5828c36ff7629de594817e4462c07\": container with ID starting with c604059ff661182517bbccb56c250a42aba5828c36ff7629de594817e4462c07 not found: ID does not exist" Apr 16 18:31:07.262045 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:07.261981 2574 scope.go:117] "RemoveContainer" containerID="5d6ea4601f2c87e22e6794c863dc190f076286cf28225309574edcbc3f9dff87" Apr 16 18:31:07.262445 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:31:07.262425 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d6ea4601f2c87e22e6794c863dc190f076286cf28225309574edcbc3f9dff87\": container with ID starting 
with 5d6ea4601f2c87e22e6794c863dc190f076286cf28225309574edcbc3f9dff87 not found: ID does not exist" containerID="5d6ea4601f2c87e22e6794c863dc190f076286cf28225309574edcbc3f9dff87" Apr 16 18:31:07.262518 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:07.262450 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d6ea4601f2c87e22e6794c863dc190f076286cf28225309574edcbc3f9dff87"} err="failed to get container status \"5d6ea4601f2c87e22e6794c863dc190f076286cf28225309574edcbc3f9dff87\": rpc error: code = NotFound desc = could not find container \"5d6ea4601f2c87e22e6794c863dc190f076286cf28225309574edcbc3f9dff87\": container with ID starting with 5d6ea4601f2c87e22e6794c863dc190f076286cf28225309574edcbc3f9dff87 not found: ID does not exist" Apr 16 18:31:08.153710 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:08.153617 2574 generic.go:358] "Generic (PLEG): container finished" podID="c48175cf-0af6-418c-ba31-396e10369610" containerID="e549d6abd4cbf92b6f3a07230d0dff61231f2c8944b9805b9a2e2ec2c5006661" exitCode=0 Apr 16 18:31:08.153710 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:08.153692 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl" event={"ID":"c48175cf-0af6-418c-ba31-396e10369610","Type":"ContainerDied","Data":"e549d6abd4cbf92b6f3a07230d0dff61231f2c8944b9805b9a2e2ec2c5006661"} Apr 16 18:31:08.940193 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:08.940156 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd" path="/var/lib/kubelet/pods/6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd/volumes" Apr 16 18:31:09.160803 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:09.160768 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl" 
event={"ID":"c48175cf-0af6-418c-ba31-396e10369610","Type":"ContainerStarted","Data":"2d83113aa14465b2242299c74aece19c3688f7cbe26518b94f7f815e136f5875"} Apr 16 18:31:09.210523 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:09.210402 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl" podStartSLOduration=6.210381152 podStartE2EDuration="6.210381152s" podCreationTimestamp="2026-04-16 18:31:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:31:09.208622108 +0000 UTC m=+1288.863856962" watchObservedRunningTime="2026-04-16 18:31:09.210381152 +0000 UTC m=+1288.865616005" Apr 16 18:31:09.725228 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:09.725174 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7" podUID="a5183c81-a13d-4da1-a55e-a1174b697e5c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused" Apr 16 18:31:13.440075 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:13.440035 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl" Apr 16 18:31:13.440075 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:13.440088 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl" Apr 16 18:31:13.441682 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:13.441654 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl" podUID="c48175cf-0af6-418c-ba31-396e10369610" containerName="main" probeResult="failure" output="Get \"https://10.133.0.35:8000/health\": dial tcp 10.133.0.35:8000: connect: 
connection refused" Apr 16 18:31:19.726173 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:19.726102 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7" podUID="a5183c81-a13d-4da1-a55e-a1174b697e5c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused" Apr 16 18:31:23.440035 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:23.439982 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl" podUID="c48175cf-0af6-418c-ba31-396e10369610" containerName="main" probeResult="failure" output="Get \"https://10.133.0.35:8000/health\": dial tcp 10.133.0.35:8000: connect: connection refused" Apr 16 18:31:29.725981 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:29.725932 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7" podUID="a5183c81-a13d-4da1-a55e-a1174b697e5c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused" Apr 16 18:31:33.440560 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:33.440506 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl" podUID="c48175cf-0af6-418c-ba31-396e10369610" containerName="main" probeResult="failure" output="Get \"https://10.133.0.35:8000/health\": dial tcp 10.133.0.35:8000: connect: connection refused" Apr 16 18:31:39.725550 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:39.725491 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7" podUID="a5183c81-a13d-4da1-a55e-a1174b697e5c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: 
connection refused" Apr 16 18:31:43.440587 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:43.440521 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl" podUID="c48175cf-0af6-418c-ba31-396e10369610" containerName="main" probeResult="failure" output="Get \"https://10.133.0.35:8000/health\": dial tcp 10.133.0.35:8000: connect: connection refused" Apr 16 18:31:49.725985 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:49.725932 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7" podUID="a5183c81-a13d-4da1-a55e-a1174b697e5c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused" Apr 16 18:31:53.440418 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:53.440321 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl" podUID="c48175cf-0af6-418c-ba31-396e10369610" containerName="main" probeResult="failure" output="Get \"https://10.133.0.35:8000/health\": dial tcp 10.133.0.35:8000: connect: connection refused" Apr 16 18:31:59.735276 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:59.735237 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7" Apr 16 18:31:59.744456 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:31:59.744423 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7" Apr 16 18:32:01.245972 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:01.245925 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7"] Apr 16 18:32:01.354178 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:01.354137 2574 kuberuntime_container.go:864] "Killing container 
with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7" podUID="a5183c81-a13d-4da1-a55e-a1174b697e5c" containerName="main" containerID="cri-o://15be2241f38187f9d56fd9170d8f47472368b848cf52c22e8e48d29152c0c3c6" gracePeriod=30 Apr 16 18:32:03.440018 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:03.439968 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl" podUID="c48175cf-0af6-418c-ba31-396e10369610" containerName="main" probeResult="failure" output="Get \"https://10.133.0.35:8000/health\": dial tcp 10.133.0.35:8000: connect: connection refused" Apr 16 18:32:13.439838 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:13.439784 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl" podUID="c48175cf-0af6-418c-ba31-396e10369610" containerName="main" probeResult="failure" output="Get \"https://10.133.0.35:8000/health\": dial tcp 10.133.0.35:8000: connect: connection refused" Apr 16 18:32:23.439680 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:23.439635 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl" podUID="c48175cf-0af6-418c-ba31-396e10369610" containerName="main" probeResult="failure" output="Get \"https://10.133.0.35:8000/health\": dial tcp 10.133.0.35:8000: connect: connection refused" Apr 16 18:32:31.640435 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:31.640409 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-7fb68447c8-z6cf7_a5183c81-a13d-4da1-a55e-a1174b697e5c/main/0.log" Apr 16 18:32:31.640881 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:31.640807 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7" Apr 16 18:32:31.704022 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:31.703983 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xljwf\" (UniqueName: \"kubernetes.io/projected/a5183c81-a13d-4da1-a55e-a1174b697e5c-kube-api-access-xljwf\") pod \"a5183c81-a13d-4da1-a55e-a1174b697e5c\" (UID: \"a5183c81-a13d-4da1-a55e-a1174b697e5c\") " Apr 16 18:32:31.704218 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:31.704038 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a5183c81-a13d-4da1-a55e-a1174b697e5c-dshm\") pod \"a5183c81-a13d-4da1-a55e-a1174b697e5c\" (UID: \"a5183c81-a13d-4da1-a55e-a1174b697e5c\") " Apr 16 18:32:31.704218 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:31.704075 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a5183c81-a13d-4da1-a55e-a1174b697e5c-tls-certs\") pod \"a5183c81-a13d-4da1-a55e-a1174b697e5c\" (UID: \"a5183c81-a13d-4da1-a55e-a1174b697e5c\") " Apr 16 18:32:31.704218 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:31.704112 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a5183c81-a13d-4da1-a55e-a1174b697e5c-home\") pod \"a5183c81-a13d-4da1-a55e-a1174b697e5c\" (UID: \"a5183c81-a13d-4da1-a55e-a1174b697e5c\") " Apr 16 18:32:31.704218 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:31.704157 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a5183c81-a13d-4da1-a55e-a1174b697e5c-kserve-provision-location\") pod \"a5183c81-a13d-4da1-a55e-a1174b697e5c\" (UID: \"a5183c81-a13d-4da1-a55e-a1174b697e5c\") " Apr 16 18:32:31.704218 ip-10-0-142-43 kubenswrapper[2574]: I0416 
18:32:31.704190 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a5183c81-a13d-4da1-a55e-a1174b697e5c-model-cache\") pod \"a5183c81-a13d-4da1-a55e-a1174b697e5c\" (UID: \"a5183c81-a13d-4da1-a55e-a1174b697e5c\") " Apr 16 18:32:31.704691 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:31.704652 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5183c81-a13d-4da1-a55e-a1174b697e5c-model-cache" (OuterVolumeSpecName: "model-cache") pod "a5183c81-a13d-4da1-a55e-a1174b697e5c" (UID: "a5183c81-a13d-4da1-a55e-a1174b697e5c"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:32:31.704810 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:31.704726 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5183c81-a13d-4da1-a55e-a1174b697e5c-home" (OuterVolumeSpecName: "home") pod "a5183c81-a13d-4da1-a55e-a1174b697e5c" (UID: "a5183c81-a13d-4da1-a55e-a1174b697e5c"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:32:31.706766 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:31.706724 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5183c81-a13d-4da1-a55e-a1174b697e5c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a5183c81-a13d-4da1-a55e-a1174b697e5c" (UID: "a5183c81-a13d-4da1-a55e-a1174b697e5c"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:32:31.706881 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:31.706818 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5183c81-a13d-4da1-a55e-a1174b697e5c-dshm" (OuterVolumeSpecName: "dshm") pod "a5183c81-a13d-4da1-a55e-a1174b697e5c" (UID: "a5183c81-a13d-4da1-a55e-a1174b697e5c"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:32:31.706881 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:31.706848 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5183c81-a13d-4da1-a55e-a1174b697e5c-kube-api-access-xljwf" (OuterVolumeSpecName: "kube-api-access-xljwf") pod "a5183c81-a13d-4da1-a55e-a1174b697e5c" (UID: "a5183c81-a13d-4da1-a55e-a1174b697e5c"). InnerVolumeSpecName "kube-api-access-xljwf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:32:31.760613 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:31.760552 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5183c81-a13d-4da1-a55e-a1174b697e5c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a5183c81-a13d-4da1-a55e-a1174b697e5c" (UID: "a5183c81-a13d-4da1-a55e-a1174b697e5c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:32:31.805847 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:31.805788 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xljwf\" (UniqueName: \"kubernetes.io/projected/a5183c81-a13d-4da1-a55e-a1174b697e5c-kube-api-access-xljwf\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:32:31.805847 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:31.805820 2574 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a5183c81-a13d-4da1-a55e-a1174b697e5c-dshm\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:32:31.805847 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:31.805855 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a5183c81-a13d-4da1-a55e-a1174b697e5c-tls-certs\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:32:31.806132 ip-10-0-142-43 kubenswrapper[2574]: 
I0416 18:32:31.805864 2574 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a5183c81-a13d-4da1-a55e-a1174b697e5c-home\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:32:31.806132 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:31.805873 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a5183c81-a13d-4da1-a55e-a1174b697e5c-kserve-provision-location\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:32:31.806132 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:31.805882 2574 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a5183c81-a13d-4da1-a55e-a1174b697e5c-model-cache\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:32:32.457685 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:32.457652 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-7fb68447c8-z6cf7_a5183c81-a13d-4da1-a55e-a1174b697e5c/main/0.log" Apr 16 18:32:32.458013 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:32.457983 2574 generic.go:358] "Generic (PLEG): container finished" podID="a5183c81-a13d-4da1-a55e-a1174b697e5c" containerID="15be2241f38187f9d56fd9170d8f47472368b848cf52c22e8e48d29152c0c3c6" exitCode=137 Apr 16 18:32:32.458128 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:32.458060 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7" Apr 16 18:32:32.458128 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:32.458072 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7" event={"ID":"a5183c81-a13d-4da1-a55e-a1174b697e5c","Type":"ContainerDied","Data":"15be2241f38187f9d56fd9170d8f47472368b848cf52c22e8e48d29152c0c3c6"} Apr 16 18:32:32.458128 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:32.458113 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7" event={"ID":"a5183c81-a13d-4da1-a55e-a1174b697e5c","Type":"ContainerDied","Data":"6372994f1444062f30ac1b6528d6dc01bc95438a4b06af662a7790c09ad904f2"} Apr 16 18:32:32.458128 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:32.458128 2574 scope.go:117] "RemoveContainer" containerID="15be2241f38187f9d56fd9170d8f47472368b848cf52c22e8e48d29152c0c3c6" Apr 16 18:32:32.482222 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:32.482189 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7"] Apr 16 18:32:32.484629 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:32.484606 2574 scope.go:117] "RemoveContainer" containerID="05a5939561e8a1dc5f316cca3a33cb4a2b30082b5c87e8d03b69c0edf6447752" Apr 16 18:32:32.488402 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:32.488376 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-7fb68447c8-z6cf7"] Apr 16 18:32:32.547632 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:32.547607 2574 scope.go:117] "RemoveContainer" containerID="15be2241f38187f9d56fd9170d8f47472368b848cf52c22e8e48d29152c0c3c6" Apr 16 18:32:32.548001 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:32:32.547978 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"15be2241f38187f9d56fd9170d8f47472368b848cf52c22e8e48d29152c0c3c6\": container with ID starting with 15be2241f38187f9d56fd9170d8f47472368b848cf52c22e8e48d29152c0c3c6 not found: ID does not exist" containerID="15be2241f38187f9d56fd9170d8f47472368b848cf52c22e8e48d29152c0c3c6" Apr 16 18:32:32.548118 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:32.548010 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15be2241f38187f9d56fd9170d8f47472368b848cf52c22e8e48d29152c0c3c6"} err="failed to get container status \"15be2241f38187f9d56fd9170d8f47472368b848cf52c22e8e48d29152c0c3c6\": rpc error: code = NotFound desc = could not find container \"15be2241f38187f9d56fd9170d8f47472368b848cf52c22e8e48d29152c0c3c6\": container with ID starting with 15be2241f38187f9d56fd9170d8f47472368b848cf52c22e8e48d29152c0c3c6 not found: ID does not exist" Apr 16 18:32:32.548118 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:32.548032 2574 scope.go:117] "RemoveContainer" containerID="05a5939561e8a1dc5f316cca3a33cb4a2b30082b5c87e8d03b69c0edf6447752" Apr 16 18:32:32.548334 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:32:32.548312 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05a5939561e8a1dc5f316cca3a33cb4a2b30082b5c87e8d03b69c0edf6447752\": container with ID starting with 05a5939561e8a1dc5f316cca3a33cb4a2b30082b5c87e8d03b69c0edf6447752 not found: ID does not exist" containerID="05a5939561e8a1dc5f316cca3a33cb4a2b30082b5c87e8d03b69c0edf6447752" Apr 16 18:32:32.548406 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:32.548340 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05a5939561e8a1dc5f316cca3a33cb4a2b30082b5c87e8d03b69c0edf6447752"} err="failed to get container status \"05a5939561e8a1dc5f316cca3a33cb4a2b30082b5c87e8d03b69c0edf6447752\": rpc error: code = NotFound desc = could not find container 
\"05a5939561e8a1dc5f316cca3a33cb4a2b30082b5c87e8d03b69c0edf6447752\": container with ID starting with 05a5939561e8a1dc5f316cca3a33cb4a2b30082b5c87e8d03b69c0edf6447752 not found: ID does not exist" Apr 16 18:32:32.938189 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:32.938146 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5183c81-a13d-4da1-a55e-a1174b697e5c" path="/var/lib/kubelet/pods/a5183c81-a13d-4da1-a55e-a1174b697e5c/volumes" Apr 16 18:32:33.440100 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:33.440055 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl" podUID="c48175cf-0af6-418c-ba31-396e10369610" containerName="main" probeResult="failure" output="Get \"https://10.133.0.35:8000/health\": dial tcp 10.133.0.35:8000: connect: connection refused" Apr 16 18:32:37.614753 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:37.614703 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-5596b59666-9h2h6"] Apr 16 18:32:37.615283 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:37.614998 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/llmisvc-controller-manager-5596b59666-9h2h6" podUID="77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8" containerName="manager" containerID="cri-o://daf83bbaca171f725bc8bdee92f67893be949e0d9829e1572e504292c8e58547" gracePeriod=30 Apr 16 18:32:37.867004 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:37.866933 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-5596b59666-9h2h6" Apr 16 18:32:37.959615 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:37.959578 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-964hv\" (UniqueName: \"kubernetes.io/projected/77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8-kube-api-access-964hv\") pod \"77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8\" (UID: \"77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8\") " Apr 16 18:32:37.959795 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:37.959656 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8-cert\") pod \"77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8\" (UID: \"77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8\") " Apr 16 18:32:37.962029 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:37.962000 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8-cert" (OuterVolumeSpecName: "cert") pod "77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8" (UID: "77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:32:37.962174 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:37.962136 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8-kube-api-access-964hv" (OuterVolumeSpecName: "kube-api-access-964hv") pod "77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8" (UID: "77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8"). InnerVolumeSpecName "kube-api-access-964hv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:32:38.060969 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:38.060926 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-964hv\" (UniqueName: \"kubernetes.io/projected/77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8-kube-api-access-964hv\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:32:38.060969 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:38.060960 2574 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8-cert\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:32:38.479563 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:38.479526 2574 generic.go:358] "Generic (PLEG): container finished" podID="77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8" containerID="daf83bbaca171f725bc8bdee92f67893be949e0d9829e1572e504292c8e58547" exitCode=0 Apr 16 18:32:38.479743 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:38.479597 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-5596b59666-9h2h6" Apr 16 18:32:38.479743 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:38.479623 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5596b59666-9h2h6" event={"ID":"77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8","Type":"ContainerDied","Data":"daf83bbaca171f725bc8bdee92f67893be949e0d9829e1572e504292c8e58547"} Apr 16 18:32:38.479743 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:38.479666 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5596b59666-9h2h6" event={"ID":"77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8","Type":"ContainerDied","Data":"1018f065a7e9cfc15b5a6a2d8d3fa309fafbe1c6f3065304d10ef893d39702c0"} Apr 16 18:32:38.479743 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:38.479683 2574 scope.go:117] "RemoveContainer" containerID="daf83bbaca171f725bc8bdee92f67893be949e0d9829e1572e504292c8e58547" Apr 16 18:32:38.488819 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:38.488790 2574 scope.go:117] "RemoveContainer" containerID="daf83bbaca171f725bc8bdee92f67893be949e0d9829e1572e504292c8e58547" Apr 16 18:32:38.489130 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:32:38.489108 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"daf83bbaca171f725bc8bdee92f67893be949e0d9829e1572e504292c8e58547\": container with ID starting with daf83bbaca171f725bc8bdee92f67893be949e0d9829e1572e504292c8e58547 not found: ID does not exist" containerID="daf83bbaca171f725bc8bdee92f67893be949e0d9829e1572e504292c8e58547" Apr 16 18:32:38.489197 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:38.489137 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daf83bbaca171f725bc8bdee92f67893be949e0d9829e1572e504292c8e58547"} err="failed to get container status \"daf83bbaca171f725bc8bdee92f67893be949e0d9829e1572e504292c8e58547\": 
rpc error: code = NotFound desc = could not find container \"daf83bbaca171f725bc8bdee92f67893be949e0d9829e1572e504292c8e58547\": container with ID starting with daf83bbaca171f725bc8bdee92f67893be949e0d9829e1572e504292c8e58547 not found: ID does not exist" Apr 16 18:32:38.502884 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:38.502850 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-5596b59666-9h2h6"] Apr 16 18:32:38.508572 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:38.508539 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/llmisvc-controller-manager-5596b59666-9h2h6"] Apr 16 18:32:38.938257 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:38.938226 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8" path="/var/lib/kubelet/pods/77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8/volumes" Apr 16 18:32:43.440520 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:43.440478 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl" podUID="c48175cf-0af6-418c-ba31-396e10369610" containerName="main" probeResult="failure" output="Get \"https://10.133.0.35:8000/health\": dial tcp 10.133.0.35:8000: connect: connection refused" Apr 16 18:32:53.450189 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:53.450153 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl" Apr 16 18:32:53.458529 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:32:53.458503 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl" Apr 16 18:33:02.438933 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:02.438894 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl"] Apr 16 18:33:02.439355 
ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:02.439164 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl" podUID="c48175cf-0af6-418c-ba31-396e10369610" containerName="main" containerID="cri-o://2d83113aa14465b2242299c74aece19c3688f7cbe26518b94f7f815e136f5875" gracePeriod=30 Apr 16 18:33:13.836782 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:13.836744 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts"] Apr 16 18:33:13.837249 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:13.837062 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8" containerName="manager" Apr 16 18:33:13.837249 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:13.837073 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8" containerName="manager" Apr 16 18:33:13.837249 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:13.837085 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd" containerName="main" Apr 16 18:33:13.837249 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:13.837091 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd" containerName="main" Apr 16 18:33:13.837249 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:13.837100 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd" containerName="storage-initializer" Apr 16 18:33:13.837249 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:13.837107 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd" containerName="storage-initializer" Apr 16 18:33:13.837249 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:13.837118 2574 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a5183c81-a13d-4da1-a55e-a1174b697e5c" containerName="storage-initializer" Apr 16 18:33:13.837249 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:13.837123 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5183c81-a13d-4da1-a55e-a1174b697e5c" containerName="storage-initializer" Apr 16 18:33:13.837249 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:13.837130 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a5183c81-a13d-4da1-a55e-a1174b697e5c" containerName="main" Apr 16 18:33:13.837249 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:13.837135 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5183c81-a13d-4da1-a55e-a1174b697e5c" containerName="main" Apr 16 18:33:13.837249 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:13.837183 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="77eb27cc-6ac4-4adc-9f9a-bfb84cc33cf8" containerName="manager" Apr 16 18:33:13.837249 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:13.837193 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="6d4b19f7-c2c6-43f0-a27f-8b1bef8edabd" containerName="main" Apr 16 18:33:13.837249 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:13.837198 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="a5183c81-a13d-4da1-a55e-a1174b697e5c" containerName="main" Apr 16 18:33:13.841551 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:13.841519 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" Apr 16 18:33:13.842508 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:13.842485 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7"] Apr 16 18:33:13.844523 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:13.844487 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-dockercfg-pdzzs\"" Apr 16 18:33:13.844704 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:13.844687 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 16 18:33:13.846131 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:13.846111 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" Apr 16 18:33:13.855314 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:13.855277 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts"] Apr 16 18:33:13.857846 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:13.857801 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7"] Apr 16 18:33:13.963549 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:13.963508 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts\" (UID: \"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" Apr 16 18:33:13.963751 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:13.963564 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzj4g\" (UniqueName: \"kubernetes.io/projected/158967f4-4093-40cd-9f7c-a987d16c8a1c-kube-api-access-qzj4g\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7\" (UID: \"158967f4-4093-40cd-9f7c-a987d16c8a1c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" Apr 16 18:33:13.963751 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:13.963623 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/158967f4-4093-40cd-9f7c-a987d16c8a1c-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7\" (UID: \"158967f4-4093-40cd-9f7c-a987d16c8a1c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" Apr 16 18:33:13.963751 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:13.963711 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/158967f4-4093-40cd-9f7c-a987d16c8a1c-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7\" (UID: \"158967f4-4093-40cd-9f7c-a987d16c8a1c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" Apr 16 18:33:13.963751 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:13.963741 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/158967f4-4093-40cd-9f7c-a987d16c8a1c-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7\" (UID: 
\"158967f4-4093-40cd-9f7c-a987d16c8a1c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" Apr 16 18:33:13.964043 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:13.963773 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts\" (UID: \"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" Apr 16 18:33:13.964043 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:13.963800 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts\" (UID: \"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" Apr 16 18:33:13.964043 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:13.963845 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts\" (UID: \"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" Apr 16 18:33:13.964043 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:13.963874 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts\" 
(UID: \"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" Apr 16 18:33:13.964043 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:13.963894 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/158967f4-4093-40cd-9f7c-a987d16c8a1c-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7\" (UID: \"158967f4-4093-40cd-9f7c-a987d16c8a1c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" Apr 16 18:33:13.964043 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:13.963913 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/158967f4-4093-40cd-9f7c-a987d16c8a1c-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7\" (UID: \"158967f4-4093-40cd-9f7c-a987d16c8a1c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" Apr 16 18:33:13.964043 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:13.963938 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5rqq\" (UniqueName: \"kubernetes.io/projected/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4-kube-api-access-t5rqq\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts\" (UID: \"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" Apr 16 18:33:14.065110 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:14.065066 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts\" (UID: 
\"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" Apr 16 18:33:14.065316 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:14.065118 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts\" (UID: \"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" Apr 16 18:33:14.065316 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:14.065157 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts\" (UID: \"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" Apr 16 18:33:14.065316 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:14.065187 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/158967f4-4093-40cd-9f7c-a987d16c8a1c-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7\" (UID: \"158967f4-4093-40cd-9f7c-a987d16c8a1c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" Apr 16 18:33:14.065316 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:14.065212 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/158967f4-4093-40cd-9f7c-a987d16c8a1c-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7\" (UID: \"158967f4-4093-40cd-9f7c-a987d16c8a1c\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" Apr 16 18:33:14.065316 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:14.065248 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5rqq\" (UniqueName: \"kubernetes.io/projected/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4-kube-api-access-t5rqq\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts\" (UID: \"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" Apr 16 18:33:14.065316 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:14.065280 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts\" (UID: \"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" Apr 16 18:33:14.065316 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:14.065315 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qzj4g\" (UniqueName: \"kubernetes.io/projected/158967f4-4093-40cd-9f7c-a987d16c8a1c-kube-api-access-qzj4g\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7\" (UID: \"158967f4-4093-40cd-9f7c-a987d16c8a1c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" Apr 16 18:33:14.065690 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:14.065343 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/158967f4-4093-40cd-9f7c-a987d16c8a1c-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7\" (UID: \"158967f4-4093-40cd-9f7c-a987d16c8a1c\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" Apr 16 18:33:14.065690 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:14.065385 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/158967f4-4093-40cd-9f7c-a987d16c8a1c-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7\" (UID: \"158967f4-4093-40cd-9f7c-a987d16c8a1c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" Apr 16 18:33:14.065690 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:14.065414 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/158967f4-4093-40cd-9f7c-a987d16c8a1c-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7\" (UID: \"158967f4-4093-40cd-9f7c-a987d16c8a1c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" Apr 16 18:33:14.065690 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:14.065449 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts\" (UID: \"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" Apr 16 18:33:14.065690 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:14.065569 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts\" (UID: \"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" Apr 16 18:33:14.065690 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:14.065597 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/158967f4-4093-40cd-9f7c-a987d16c8a1c-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7\" (UID: \"158967f4-4093-40cd-9f7c-a987d16c8a1c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" Apr 16 18:33:14.065690 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:14.065597 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts\" (UID: \"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" Apr 16 18:33:14.066095 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:14.065772 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts\" (UID: \"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" Apr 16 18:33:14.066095 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:14.066007 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/158967f4-4093-40cd-9f7c-a987d16c8a1c-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7\" (UID: \"158967f4-4093-40cd-9f7c-a987d16c8a1c\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" Apr 16 18:33:14.066207 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:14.066148 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/158967f4-4093-40cd-9f7c-a987d16c8a1c-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7\" (UID: \"158967f4-4093-40cd-9f7c-a987d16c8a1c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" Apr 16 18:33:14.068065 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:14.068034 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts\" (UID: \"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" Apr 16 18:33:14.068198 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:14.068064 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts\" (UID: \"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" Apr 16 18:33:14.068198 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:14.068081 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/158967f4-4093-40cd-9f7c-a987d16c8a1c-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7\" (UID: \"158967f4-4093-40cd-9f7c-a987d16c8a1c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" Apr 16 18:33:14.068198 ip-10-0-142-43 kubenswrapper[2574]: I0416 
18:33:14.068094 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/158967f4-4093-40cd-9f7c-a987d16c8a1c-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7\" (UID: \"158967f4-4093-40cd-9f7c-a987d16c8a1c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" Apr 16 18:33:14.083104 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:14.083067 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5rqq\" (UniqueName: \"kubernetes.io/projected/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4-kube-api-access-t5rqq\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts\" (UID: \"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" Apr 16 18:33:14.085914 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:14.085876 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzj4g\" (UniqueName: \"kubernetes.io/projected/158967f4-4093-40cd-9f7c-a987d16c8a1c-kube-api-access-qzj4g\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7\" (UID: \"158967f4-4093-40cd-9f7c-a987d16c8a1c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" Apr 16 18:33:14.154605 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:14.154564 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" Apr 16 18:33:14.164084 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:14.164055 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" Apr 16 18:33:14.298055 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:14.297908 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts"] Apr 16 18:33:14.300906 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:33:14.300874 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fa1bfcf_592e_44ab_ac60_e9ee7850a8a4.slice/crio-4ab86e442a3413c76524812c14f9e57d3dfb1ad7bb01ab0aa6435b4a70f661b4 WatchSource:0}: Error finding container 4ab86e442a3413c76524812c14f9e57d3dfb1ad7bb01ab0aa6435b4a70f661b4: Status 404 returned error can't find the container with id 4ab86e442a3413c76524812c14f9e57d3dfb1ad7bb01ab0aa6435b4a70f661b4 Apr 16 18:33:14.319804 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:14.319778 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7"] Apr 16 18:33:14.321521 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:33:14.321499 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod158967f4_4093_40cd_9f7c_a987d16c8a1c.slice/crio-3dd61b8021acaf5ccea7f5f1fa017cf4594b4a3abd7f69f4b8bd43bb2f62c7a5 WatchSource:0}: Error finding container 3dd61b8021acaf5ccea7f5f1fa017cf4594b4a3abd7f69f4b8bd43bb2f62c7a5: Status 404 returned error can't find the container with id 3dd61b8021acaf5ccea7f5f1fa017cf4594b4a3abd7f69f4b8bd43bb2f62c7a5 Apr 16 18:33:14.599228 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:14.599129 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" 
event={"ID":"158967f4-4093-40cd-9f7c-a987d16c8a1c","Type":"ContainerStarted","Data":"03508532ebee044d48ad510fd68429beda7b79af5574216ca772857a04e7f86c"} Apr 16 18:33:14.599228 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:14.599177 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" event={"ID":"158967f4-4093-40cd-9f7c-a987d16c8a1c","Type":"ContainerStarted","Data":"3dd61b8021acaf5ccea7f5f1fa017cf4594b4a3abd7f69f4b8bd43bb2f62c7a5"} Apr 16 18:33:14.600272 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:14.600248 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" event={"ID":"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4","Type":"ContainerStarted","Data":"4ab86e442a3413c76524812c14f9e57d3dfb1ad7bb01ab0aa6435b4a70f661b4"} Apr 16 18:33:15.605997 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:15.605874 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" event={"ID":"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4","Type":"ContainerStarted","Data":"218ba5ee943a80826574dd3b086d8102c477eac9e8fe213b1675ad681ebdd165"} Apr 16 18:33:16.612404 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:16.612361 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" event={"ID":"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4","Type":"ContainerStarted","Data":"0f80c59cf229fdd78bb2261d422a8d5083adbaf4fd9607daae66521e4642e0b3"} Apr 16 18:33:16.613464 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:16.612608 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" Apr 16 18:33:19.578951 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:19.578913 2574 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj"] Apr 16 18:33:19.584118 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:19.584086 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" Apr 16 18:33:19.587372 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:19.587346 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8de1d74aab16d9cabd8b5aafeb5248e8-kserve-self-signed-certs\"" Apr 16 18:33:19.596362 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:19.596332 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj"] Apr 16 18:33:19.625131 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:19.625095 2574 generic.go:358] "Generic (PLEG): container finished" podID="158967f4-4093-40cd-9f7c-a987d16c8a1c" containerID="03508532ebee044d48ad510fd68429beda7b79af5574216ca772857a04e7f86c" exitCode=0 Apr 16 18:33:19.625328 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:19.625150 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" event={"ID":"158967f4-4093-40cd-9f7c-a987d16c8a1c","Type":"ContainerDied","Data":"03508532ebee044d48ad510fd68429beda7b79af5574216ca772857a04e7f86c"} Apr 16 18:33:19.721626 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:19.721584 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w7k2\" (UniqueName: \"kubernetes.io/projected/1bb88b2a-1562-42d2-b39d-745d4c28d3e7-kube-api-access-8w7k2\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj\" (UID: \"1bb88b2a-1562-42d2-b39d-745d4c28d3e7\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" Apr 16 18:33:19.721805 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:19.721668 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1bb88b2a-1562-42d2-b39d-745d4c28d3e7-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj\" (UID: \"1bb88b2a-1562-42d2-b39d-745d4c28d3e7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" Apr 16 18:33:19.721805 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:19.721752 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1bb88b2a-1562-42d2-b39d-745d4c28d3e7-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj\" (UID: \"1bb88b2a-1562-42d2-b39d-745d4c28d3e7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" Apr 16 18:33:19.721914 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:19.721805 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1bb88b2a-1562-42d2-b39d-745d4c28d3e7-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj\" (UID: \"1bb88b2a-1562-42d2-b39d-745d4c28d3e7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" Apr 16 18:33:19.721914 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:19.721882 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1bb88b2a-1562-42d2-b39d-745d4c28d3e7-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj\" (UID: \"1bb88b2a-1562-42d2-b39d-745d4c28d3e7\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" Apr 16 18:33:19.721914 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:19.721909 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1bb88b2a-1562-42d2-b39d-745d4c28d3e7-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj\" (UID: \"1bb88b2a-1562-42d2-b39d-745d4c28d3e7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" Apr 16 18:33:19.823400 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:19.823362 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8w7k2\" (UniqueName: \"kubernetes.io/projected/1bb88b2a-1562-42d2-b39d-745d4c28d3e7-kube-api-access-8w7k2\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj\" (UID: \"1bb88b2a-1562-42d2-b39d-745d4c28d3e7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" Apr 16 18:33:19.823579 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:19.823409 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1bb88b2a-1562-42d2-b39d-745d4c28d3e7-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj\" (UID: \"1bb88b2a-1562-42d2-b39d-745d4c28d3e7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" Apr 16 18:33:19.823579 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:19.823443 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1bb88b2a-1562-42d2-b39d-745d4c28d3e7-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj\" (UID: \"1bb88b2a-1562-42d2-b39d-745d4c28d3e7\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" Apr 16 18:33:19.823579 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:19.823479 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1bb88b2a-1562-42d2-b39d-745d4c28d3e7-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj\" (UID: \"1bb88b2a-1562-42d2-b39d-745d4c28d3e7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" Apr 16 18:33:19.823579 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:19.823523 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1bb88b2a-1562-42d2-b39d-745d4c28d3e7-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj\" (UID: \"1bb88b2a-1562-42d2-b39d-745d4c28d3e7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" Apr 16 18:33:19.823579 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:19.823550 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1bb88b2a-1562-42d2-b39d-745d4c28d3e7-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj\" (UID: \"1bb88b2a-1562-42d2-b39d-745d4c28d3e7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" Apr 16 18:33:19.823950 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:19.823906 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1bb88b2a-1562-42d2-b39d-745d4c28d3e7-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj\" (UID: \"1bb88b2a-1562-42d2-b39d-745d4c28d3e7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" Apr 
16 18:33:19.823950 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:19.823942 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1bb88b2a-1562-42d2-b39d-745d4c28d3e7-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj\" (UID: \"1bb88b2a-1562-42d2-b39d-745d4c28d3e7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" Apr 16 18:33:19.824062 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:19.824030 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1bb88b2a-1562-42d2-b39d-745d4c28d3e7-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj\" (UID: \"1bb88b2a-1562-42d2-b39d-745d4c28d3e7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" Apr 16 18:33:19.825955 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:19.825929 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1bb88b2a-1562-42d2-b39d-745d4c28d3e7-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj\" (UID: \"1bb88b2a-1562-42d2-b39d-745d4c28d3e7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" Apr 16 18:33:19.826352 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:19.826331 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1bb88b2a-1562-42d2-b39d-745d4c28d3e7-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj\" (UID: \"1bb88b2a-1562-42d2-b39d-745d4c28d3e7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" Apr 16 18:33:19.834947 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:19.834885 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-8w7k2\" (UniqueName: \"kubernetes.io/projected/1bb88b2a-1562-42d2-b39d-745d4c28d3e7-kube-api-access-8w7k2\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj\" (UID: \"1bb88b2a-1562-42d2-b39d-745d4c28d3e7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" Apr 16 18:33:19.895632 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:19.895586 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" Apr 16 18:33:20.047235 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:20.047125 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj"] Apr 16 18:33:20.050390 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:33:20.050338 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bb88b2a_1562_42d2_b39d_745d4c28d3e7.slice/crio-fd73aa61fab623dabab2614c61ff201e1c0818df74dca784cb92e9de231623df WatchSource:0}: Error finding container fd73aa61fab623dabab2614c61ff201e1c0818df74dca784cb92e9de231623df: Status 404 returned error can't find the container with id fd73aa61fab623dabab2614c61ff201e1c0818df74dca784cb92e9de231623df Apr 16 18:33:20.630884 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:20.630846 2574 generic.go:358] "Generic (PLEG): container finished" podID="1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" containerID="0f80c59cf229fdd78bb2261d422a8d5083adbaf4fd9607daae66521e4642e0b3" exitCode=0 Apr 16 18:33:20.630884 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:20.630875 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" 
event={"ID":"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4","Type":"ContainerDied","Data":"0f80c59cf229fdd78bb2261d422a8d5083adbaf4fd9607daae66521e4642e0b3"} Apr 16 18:33:20.632670 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:20.632644 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" event={"ID":"1bb88b2a-1562-42d2-b39d-745d4c28d3e7","Type":"ContainerStarted","Data":"90b33f28eb144ee53485a75bef521be00f9542b0bde334517ae49c7d3ef06abc"} Apr 16 18:33:20.632772 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:20.632677 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" event={"ID":"1bb88b2a-1562-42d2-b39d-745d4c28d3e7","Type":"ContainerStarted","Data":"fd73aa61fab623dabab2614c61ff201e1c0818df74dca784cb92e9de231623df"} Apr 16 18:33:20.634460 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:20.634441 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" event={"ID":"158967f4-4093-40cd-9f7c-a987d16c8a1c","Type":"ContainerStarted","Data":"c5a2f2100cc36f64e0866a2bc847a62d15196ab003c8c3e09a361e0b46c0a85c"} Apr 16 18:33:20.692631 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:20.692567 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" podStartSLOduration=7.692545399 podStartE2EDuration="7.692545399s" podCreationTimestamp="2026-04-16 18:33:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:33:20.690971343 +0000 UTC m=+1420.346206217" watchObservedRunningTime="2026-04-16 18:33:20.692545399 +0000 UTC m=+1420.347780254" Apr 16 18:33:21.641464 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:21.641422 2574 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" event={"ID":"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4","Type":"ContainerStarted","Data":"9f6a0361d7415593a509bdeb35d418dcda30d0be576d0a29a41339e51583f3af"} Apr 16 18:33:21.672941 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:21.672771 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" podStartSLOduration=7.762920845 podStartE2EDuration="8.67274772s" podCreationTimestamp="2026-04-16 18:33:13 +0000 UTC" firstStartedPulling="2026-04-16 18:33:14.302913699 +0000 UTC m=+1413.958148534" lastFinishedPulling="2026-04-16 18:33:15.212740575 +0000 UTC m=+1414.867975409" observedRunningTime="2026-04-16 18:33:21.670397678 +0000 UTC m=+1421.325632532" watchObservedRunningTime="2026-04-16 18:33:21.67274772 +0000 UTC m=+1421.327982575" Apr 16 18:33:24.155701 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:24.155654 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" Apr 16 18:33:24.156261 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:24.155719 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" Apr 16 18:33:24.157516 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:24.157461 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" podUID="1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 18:33:24.165229 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:24.165191 2574 kubelet.go:2658] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" Apr 16 18:33:24.165419 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:24.165245 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" Apr 16 18:33:24.166737 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:24.166697 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" podUID="158967f4-4093-40cd-9f7c-a987d16c8a1c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 18:33:24.655726 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:24.655692 2574 generic.go:358] "Generic (PLEG): container finished" podID="1bb88b2a-1562-42d2-b39d-745d4c28d3e7" containerID="90b33f28eb144ee53485a75bef521be00f9542b0bde334517ae49c7d3ef06abc" exitCode=0 Apr 16 18:33:24.655957 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:24.655751 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" event={"ID":"1bb88b2a-1562-42d2-b39d-745d4c28d3e7","Type":"ContainerDied","Data":"90b33f28eb144ee53485a75bef521be00f9542b0bde334517ae49c7d3ef06abc"} Apr 16 18:33:25.668594 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:25.668548 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" event={"ID":"1bb88b2a-1562-42d2-b39d-745d4c28d3e7","Type":"ContainerStarted","Data":"1a4721991dbdc1ec47b431abc0478c5dd5ceaf18e492f3f222a436bf045a084a"} Apr 16 18:33:25.694020 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:25.693962 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" podStartSLOduration=6.6939473 podStartE2EDuration="6.6939473s" podCreationTimestamp="2026-04-16 18:33:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:33:25.691760887 +0000 UTC m=+1425.346995758" watchObservedRunningTime="2026-04-16 18:33:25.6939473 +0000 UTC m=+1425.349182131" Apr 16 18:33:29.895775 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:29.895729 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" Apr 16 18:33:29.896308 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:29.895837 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" Apr 16 18:33:29.897526 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:29.897494 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" podUID="1bb88b2a-1562-42d2-b39d-745d4c28d3e7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 18:33:32.698395 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:32.698260 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-b85596bcd-zghwl_c48175cf-0af6-418c-ba31-396e10369610/main/0.log" Apr 16 18:33:32.698866 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:32.698660 2574 generic.go:358] "Generic (PLEG): container finished" podID="c48175cf-0af6-418c-ba31-396e10369610" containerID="2d83113aa14465b2242299c74aece19c3688f7cbe26518b94f7f815e136f5875" exitCode=137 Apr 16 18:33:32.698866 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:32.698811 
2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl" event={"ID":"c48175cf-0af6-418c-ba31-396e10369610","Type":"ContainerDied","Data":"2d83113aa14465b2242299c74aece19c3688f7cbe26518b94f7f815e136f5875"} Apr 16 18:33:32.770603 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:32.770575 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-b85596bcd-zghwl_c48175cf-0af6-418c-ba31-396e10369610/main/0.log" Apr 16 18:33:32.771075 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:32.771051 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl" Apr 16 18:33:32.836099 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:32.836058 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c48175cf-0af6-418c-ba31-396e10369610-model-cache\") pod \"c48175cf-0af6-418c-ba31-396e10369610\" (UID: \"c48175cf-0af6-418c-ba31-396e10369610\") " Apr 16 18:33:32.836273 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:32.836113 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c48175cf-0af6-418c-ba31-396e10369610-kserve-provision-location\") pod \"c48175cf-0af6-418c-ba31-396e10369610\" (UID: \"c48175cf-0af6-418c-ba31-396e10369610\") " Apr 16 18:33:32.836273 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:32.836242 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c48175cf-0af6-418c-ba31-396e10369610-home\") pod \"c48175cf-0af6-418c-ba31-396e10369610\" (UID: \"c48175cf-0af6-418c-ba31-396e10369610\") " Apr 16 18:33:32.836394 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:32.836305 2574 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c48175cf-0af6-418c-ba31-396e10369610-tls-certs\") pod \"c48175cf-0af6-418c-ba31-396e10369610\" (UID: \"c48175cf-0af6-418c-ba31-396e10369610\") " Apr 16 18:33:32.836394 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:32.836345 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcgp8\" (UniqueName: \"kubernetes.io/projected/c48175cf-0af6-418c-ba31-396e10369610-kube-api-access-gcgp8\") pod \"c48175cf-0af6-418c-ba31-396e10369610\" (UID: \"c48175cf-0af6-418c-ba31-396e10369610\") " Apr 16 18:33:32.836394 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:32.836367 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c48175cf-0af6-418c-ba31-396e10369610-dshm\") pod \"c48175cf-0af6-418c-ba31-396e10369610\" (UID: \"c48175cf-0af6-418c-ba31-396e10369610\") " Apr 16 18:33:32.836550 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:32.836393 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c48175cf-0af6-418c-ba31-396e10369610-model-cache" (OuterVolumeSpecName: "model-cache") pod "c48175cf-0af6-418c-ba31-396e10369610" (UID: "c48175cf-0af6-418c-ba31-396e10369610"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:33:32.836756 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:32.836648 2574 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c48175cf-0af6-418c-ba31-396e10369610-model-cache\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:33:32.836887 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:32.836799 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c48175cf-0af6-418c-ba31-396e10369610-home" (OuterVolumeSpecName: "home") pod "c48175cf-0af6-418c-ba31-396e10369610" (UID: "c48175cf-0af6-418c-ba31-396e10369610"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:33:32.839004 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:32.838952 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c48175cf-0af6-418c-ba31-396e10369610-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c48175cf-0af6-418c-ba31-396e10369610" (UID: "c48175cf-0af6-418c-ba31-396e10369610"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:33:32.839249 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:32.839203 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c48175cf-0af6-418c-ba31-396e10369610-dshm" (OuterVolumeSpecName: "dshm") pod "c48175cf-0af6-418c-ba31-396e10369610" (UID: "c48175cf-0af6-418c-ba31-396e10369610"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:33:32.839619 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:32.839586 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c48175cf-0af6-418c-ba31-396e10369610-kube-api-access-gcgp8" (OuterVolumeSpecName: "kube-api-access-gcgp8") pod "c48175cf-0af6-418c-ba31-396e10369610" (UID: "c48175cf-0af6-418c-ba31-396e10369610"). InnerVolumeSpecName "kube-api-access-gcgp8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:33:32.935701 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:32.935639 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c48175cf-0af6-418c-ba31-396e10369610-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c48175cf-0af6-418c-ba31-396e10369610" (UID: "c48175cf-0af6-418c-ba31-396e10369610"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:33:32.937300 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:32.937190 2574 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c48175cf-0af6-418c-ba31-396e10369610-home\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:33:32.937300 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:32.937222 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c48175cf-0af6-418c-ba31-396e10369610-tls-certs\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:33:32.937300 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:32.937241 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gcgp8\" (UniqueName: \"kubernetes.io/projected/c48175cf-0af6-418c-ba31-396e10369610-kube-api-access-gcgp8\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:33:32.937300 ip-10-0-142-43 kubenswrapper[2574]: 
I0416 18:33:32.937257 2574 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c48175cf-0af6-418c-ba31-396e10369610-dshm\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:33:32.937300 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:32.937271 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c48175cf-0af6-418c-ba31-396e10369610-kserve-provision-location\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:33:33.704700 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:33.704667 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-b85596bcd-zghwl_c48175cf-0af6-418c-ba31-396e10369610/main/0.log" Apr 16 18:33:33.705204 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:33.705119 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl" event={"ID":"c48175cf-0af6-418c-ba31-396e10369610","Type":"ContainerDied","Data":"3ff86c6d334e988243a13f5d13a65c171006e77a487aba3b85dff16f6cd92e90"} Apr 16 18:33:33.705204 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:33.705164 2574 scope.go:117] "RemoveContainer" containerID="2d83113aa14465b2242299c74aece19c3688f7cbe26518b94f7f815e136f5875" Apr 16 18:33:33.705370 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:33.705342 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl" Apr 16 18:33:33.731561 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:33.731527 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl"] Apr 16 18:33:33.735445 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:33.735415 2574 scope.go:117] "RemoveContainer" containerID="e549d6abd4cbf92b6f3a07230d0dff61231f2c8944b9805b9a2e2ec2c5006661" Apr 16 18:33:33.736982 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:33.736724 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-b85596bcd-zghwl"] Apr 16 18:33:34.155881 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:34.155806 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" podUID="1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 18:33:34.164965 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:34.164920 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" podUID="158967f4-4093-40cd-9f7c-a987d16c8a1c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 18:33:34.169238 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:34.169214 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" Apr 16 18:33:34.939543 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:34.939488 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c48175cf-0af6-418c-ba31-396e10369610" 
path="/var/lib/kubelet/pods/c48175cf-0af6-418c-ba31-396e10369610/volumes" Apr 16 18:33:39.896429 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:39.896380 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" podUID="1bb88b2a-1562-42d2-b39d-745d4c28d3e7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 18:33:44.156344 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:44.156297 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" podUID="1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 18:33:44.164954 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:44.164907 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" podUID="158967f4-4093-40cd-9f7c-a987d16c8a1c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 18:33:49.897002 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:49.896946 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" podUID="1bb88b2a-1562-42d2-b39d-745d4c28d3e7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 18:33:54.155977 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:54.155927 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" 
podUID="1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 18:33:54.165036 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:54.164985 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" podUID="158967f4-4093-40cd-9f7c-a987d16c8a1c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 18:33:59.896296 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:33:59.896242 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" podUID="1bb88b2a-1562-42d2-b39d-745d4c28d3e7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 18:34:04.155852 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:34:04.155769 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" podUID="1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 18:34:04.164968 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:34:04.164918 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" podUID="158967f4-4093-40cd-9f7c-a987d16c8a1c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 18:34:09.896774 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:34:09.896725 2574 prober.go:120] "Probe failed" 
probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" podUID="1bb88b2a-1562-42d2-b39d-745d4c28d3e7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 18:34:14.155485 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:34:14.155434 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" podUID="1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 18:34:14.165258 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:34:14.165210 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" podUID="158967f4-4093-40cd-9f7c-a987d16c8a1c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 18:34:19.896575 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:34:19.896527 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" podUID="1bb88b2a-1562-42d2-b39d-745d4c28d3e7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 18:34:24.155839 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:34:24.155768 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" podUID="1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 
18:34:24.164903 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:34:24.164864 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" podUID="158967f4-4093-40cd-9f7c-a987d16c8a1c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 18:34:29.896962 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:34:29.896914 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" podUID="1bb88b2a-1562-42d2-b39d-745d4c28d3e7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 18:34:34.155481 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:34:34.155428 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" podUID="1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 18:34:34.164960 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:34:34.164921 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" podUID="158967f4-4093-40cd-9f7c-a987d16c8a1c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 18:34:39.896276 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:34:39.896230 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" podUID="1bb88b2a-1562-42d2-b39d-745d4c28d3e7" containerName="main" probeResult="failure" output="Get 
\"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 18:34:40.910188 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:34:40.910151 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4lsvd_1be4b879-19c7-4497-badb-3f90683cdd48/console-operator/1.log" Apr 16 18:34:40.916092 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:34:40.916058 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4lsvd_1be4b879-19c7-4497-badb-3f90683cdd48/console-operator/1.log" Apr 16 18:34:40.920867 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:34:40.920816 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wt28v_d40f8597-4c6c-46ba-9f26-4cea171429a6/ovn-acl-logging/0.log" Apr 16 18:34:40.923554 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:34:40.923531 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wt28v_d40f8597-4c6c-46ba-9f26-4cea171429a6/ovn-acl-logging/0.log" Apr 16 18:34:44.155576 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:34:44.155531 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" podUID="1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 18:34:44.165135 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:34:44.165100 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" podUID="158967f4-4093-40cd-9f7c-a987d16c8a1c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 18:34:49.896865 ip-10-0-142-43 
kubenswrapper[2574]: I0416 18:34:49.896787 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" podUID="1bb88b2a-1562-42d2-b39d-745d4c28d3e7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 18:34:54.155062 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:34:54.155013 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" podUID="1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 18:34:54.165075 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:34:54.165033 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" podUID="158967f4-4093-40cd-9f7c-a987d16c8a1c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 18:34:59.896636 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:34:59.896581 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" podUID="1bb88b2a-1562-42d2-b39d-745d4c28d3e7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 18:35:04.155951 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:35:04.155890 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" podUID="1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" containerName="main" probeResult="failure" output="Get 
\"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 18:35:04.164627 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:35:04.164585 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" podUID="158967f4-4093-40cd-9f7c-a987d16c8a1c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 18:35:09.896470 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:35:09.896421 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" podUID="1bb88b2a-1562-42d2-b39d-745d4c28d3e7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 18:35:14.156034 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:35:14.155989 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" podUID="1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 18:35:14.164987 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:35:14.164947 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" podUID="158967f4-4093-40cd-9f7c-a987d16c8a1c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 18:35:19.896212 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:35:19.896129 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" 
podUID="1bb88b2a-1562-42d2-b39d-745d4c28d3e7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 18:35:24.155267 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:35:24.155214 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" podUID="1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 18:35:24.164609 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:35:24.164564 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" podUID="158967f4-4093-40cd-9f7c-a987d16c8a1c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 18:35:29.896499 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:35:29.896448 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" podUID="1bb88b2a-1562-42d2-b39d-745d4c28d3e7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 18:35:34.155710 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:35:34.155666 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" podUID="1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 18:35:34.165309 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:35:34.165277 2574 prober.go:120] "Probe failed" 
probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" podUID="158967f4-4093-40cd-9f7c-a987d16c8a1c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 18:35:39.896907 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:35:39.896854 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" podUID="1bb88b2a-1562-42d2-b39d-745d4c28d3e7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 18:35:44.155316 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:35:44.155264 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" podUID="1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 18:35:44.164574 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:35:44.164522 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" podUID="158967f4-4093-40cd-9f7c-a987d16c8a1c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 18:35:49.896731 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:35:49.896671 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" podUID="1bb88b2a-1562-42d2-b39d-745d4c28d3e7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 
18:35:54.155682 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:35:54.155624 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" podUID="1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 18:35:54.164709 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:35:54.164669 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" podUID="158967f4-4093-40cd-9f7c-a987d16c8a1c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 18:35:59.896307 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:35:59.896260 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" podUID="1bb88b2a-1562-42d2-b39d-745d4c28d3e7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 18:36:04.155915 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:36:04.155870 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" podUID="1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 18:36:04.164469 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:36:04.164428 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" podUID="158967f4-4093-40cd-9f7c-a987d16c8a1c" containerName="main" probeResult="failure" output="Get 
\"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 18:36:09.896623 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:36:09.896577 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" podUID="1bb88b2a-1562-42d2-b39d-745d4c28d3e7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 18:36:14.155619 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:36:14.155569 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" podUID="1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 18:36:14.164696 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:36:14.164658 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" podUID="158967f4-4093-40cd-9f7c-a987d16c8a1c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 18:36:19.896395 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:36:19.896339 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" podUID="1bb88b2a-1562-42d2-b39d-745d4c28d3e7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 18:36:24.155054 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:36:24.155003 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" 
podUID="1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 16 18:36:24.166310 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:36:24.166250 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" podUID="158967f4-4093-40cd-9f7c-a987d16c8a1c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 16 18:36:29.896694 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:36:29.896652 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" podUID="1bb88b2a-1562-42d2-b39d-745d4c28d3e7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 16 18:36:34.165396 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:36:34.165361 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" Apr 16 18:36:34.180474 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:36:34.180442 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" Apr 16 18:36:34.182294 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:36:34.182266 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" Apr 16 18:36:34.188420 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:36:34.188402 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" Apr 16 18:36:39.905870 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:36:39.905813 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" Apr 16 18:36:39.913849 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:36:39.913796 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" Apr 16 18:36:48.070676 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:36:48.070621 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj"] Apr 16 18:36:48.072222 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:36:48.072144 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" podUID="1bb88b2a-1562-42d2-b39d-745d4c28d3e7" containerName="main" containerID="cri-o://1a4721991dbdc1ec47b431abc0478c5dd5ceaf18e492f3f222a436bf045a084a" gracePeriod=30 Apr 16 18:37:07.595008 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:07.594971 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 18:37:07.595490 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:07.595298 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c48175cf-0af6-418c-ba31-396e10369610" containerName="main" Apr 16 18:37:07.595490 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:07.595310 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="c48175cf-0af6-418c-ba31-396e10369610" containerName="main" Apr 16 18:37:07.595490 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:07.595327 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="c48175cf-0af6-418c-ba31-396e10369610" containerName="storage-initializer" Apr 16 18:37:07.595490 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:07.595335 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="c48175cf-0af6-418c-ba31-396e10369610" containerName="storage-initializer" Apr 16 18:37:07.595490 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:07.595395 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="c48175cf-0af6-418c-ba31-396e10369610" containerName="main" Apr 16 18:37:07.600754 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:07.600725 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:37:07.604470 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:07.604434 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-6czxt\"" Apr 16 18:37:07.604470 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:07.604439 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 16 18:37:07.616023 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:07.615992 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 18:37:07.700016 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:07.699970 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e421425-027c-465c-a1f0-e16d1f6f7266-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"5e421425-027c-465c-a1f0-e16d1f6f7266\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 
18:37:07.700206 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:07.700037 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5e421425-027c-465c-a1f0-e16d1f6f7266-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"5e421425-027c-465c-a1f0-e16d1f6f7266\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:37:07.700206 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:07.700102 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5e421425-027c-465c-a1f0-e16d1f6f7266-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"5e421425-027c-465c-a1f0-e16d1f6f7266\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:37:07.700206 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:07.700144 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwd7c\" (UniqueName: \"kubernetes.io/projected/5e421425-027c-465c-a1f0-e16d1f6f7266-kube-api-access-vwd7c\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"5e421425-027c-465c-a1f0-e16d1f6f7266\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:37:07.700206 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:07.700195 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5e421425-027c-465c-a1f0-e16d1f6f7266-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"5e421425-027c-465c-a1f0-e16d1f6f7266\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:37:07.700374 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:07.700252 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5e421425-027c-465c-a1f0-e16d1f6f7266-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"5e421425-027c-465c-a1f0-e16d1f6f7266\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:37:07.800926 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:07.800888 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e421425-027c-465c-a1f0-e16d1f6f7266-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"5e421425-027c-465c-a1f0-e16d1f6f7266\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:37:07.801122 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:07.800938 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5e421425-027c-465c-a1f0-e16d1f6f7266-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"5e421425-027c-465c-a1f0-e16d1f6f7266\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:37:07.801122 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:07.800963 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5e421425-027c-465c-a1f0-e16d1f6f7266-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"5e421425-027c-465c-a1f0-e16d1f6f7266\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:37:07.801122 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:07.800988 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vwd7c\" (UniqueName: 
\"kubernetes.io/projected/5e421425-027c-465c-a1f0-e16d1f6f7266-kube-api-access-vwd7c\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"5e421425-027c-465c-a1f0-e16d1f6f7266\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:37:07.801122 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:07.801016 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5e421425-027c-465c-a1f0-e16d1f6f7266-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"5e421425-027c-465c-a1f0-e16d1f6f7266\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:37:07.801122 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:07.801050 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5e421425-027c-465c-a1f0-e16d1f6f7266-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"5e421425-027c-465c-a1f0-e16d1f6f7266\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:37:07.801393 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:07.801359 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5e421425-027c-465c-a1f0-e16d1f6f7266-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"5e421425-027c-465c-a1f0-e16d1f6f7266\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:37:07.801456 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:07.801367 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e421425-027c-465c-a1f0-e16d1f6f7266-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: 
\"5e421425-027c-465c-a1f0-e16d1f6f7266\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:37:07.801456 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:07.801442 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5e421425-027c-465c-a1f0-e16d1f6f7266-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"5e421425-027c-465c-a1f0-e16d1f6f7266\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:37:07.803486 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:07.803452 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5e421425-027c-465c-a1f0-e16d1f6f7266-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"5e421425-027c-465c-a1f0-e16d1f6f7266\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:37:07.803647 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:07.803631 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5e421425-027c-465c-a1f0-e16d1f6f7266-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"5e421425-027c-465c-a1f0-e16d1f6f7266\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:37:07.813892 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:07.813861 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwd7c\" (UniqueName: \"kubernetes.io/projected/5e421425-027c-465c-a1f0-e16d1f6f7266-kube-api-access-vwd7c\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"5e421425-027c-465c-a1f0-e16d1f6f7266\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:37:07.911159 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:07.911118 2574 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:37:08.049759 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:08.049672 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 18:37:08.052286 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:37:08.052254 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e421425_027c_465c_a1f0_e16d1f6f7266.slice/crio-f3fc8c47efe0ad276c3c8b3a9883fb4280d6cc5854eedc5973b8c8a3da14f859 WatchSource:0}: Error finding container f3fc8c47efe0ad276c3c8b3a9883fb4280d6cc5854eedc5973b8c8a3da14f859: Status 404 returned error can't find the container with id f3fc8c47efe0ad276c3c8b3a9883fb4280d6cc5854eedc5973b8c8a3da14f859 Apr 16 18:37:08.060131 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:08.055198 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:37:08.482542 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:08.482499 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"5e421425-027c-465c-a1f0-e16d1f6f7266","Type":"ContainerStarted","Data":"26982bada79faa1089a182cb4428f039e5531aad4507575efacf34075f018106"} Apr 16 18:37:08.482542 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:08.482549 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"5e421425-027c-465c-a1f0-e16d1f6f7266","Type":"ContainerStarted","Data":"f3fc8c47efe0ad276c3c8b3a9883fb4280d6cc5854eedc5973b8c8a3da14f859"} Apr 16 18:37:13.503961 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:13.503929 2574 generic.go:358] "Generic (PLEG): container finished" 
podID="5e421425-027c-465c-a1f0-e16d1f6f7266" containerID="26982bada79faa1089a182cb4428f039e5531aad4507575efacf34075f018106" exitCode=0 Apr 16 18:37:13.504346 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:13.504004 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"5e421425-027c-465c-a1f0-e16d1f6f7266","Type":"ContainerDied","Data":"26982bada79faa1089a182cb4428f039e5531aad4507575efacf34075f018106"} Apr 16 18:37:14.510235 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:14.510201 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"5e421425-027c-465c-a1f0-e16d1f6f7266","Type":"ContainerStarted","Data":"ef1ac48886b3b794fcd36b3520f437df15b89f9004212973ab34d4f55b41460a"} Apr 16 18:37:14.533807 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:14.533738 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podStartSLOduration=7.533714368 podStartE2EDuration="7.533714368s" podCreationTimestamp="2026-04-16 18:37:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:37:14.53156821 +0000 UTC m=+1654.186803063" watchObservedRunningTime="2026-04-16 18:37:14.533714368 +0000 UTC m=+1654.188949221" Apr 16 18:37:17.911472 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:17.911433 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:37:17.913228 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:17.913197 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="5e421425-027c-465c-a1f0-e16d1f6f7266" containerName="main" 
probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused" Apr 16 18:37:18.358548 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:18.358521 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj_1bb88b2a-1562-42d2-b39d-745d4c28d3e7/main/0.log" Apr 16 18:37:18.358960 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:18.358940 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" Apr 16 18:37:18.400145 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:18.400107 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8w7k2\" (UniqueName: \"kubernetes.io/projected/1bb88b2a-1562-42d2-b39d-745d4c28d3e7-kube-api-access-8w7k2\") pod \"1bb88b2a-1562-42d2-b39d-745d4c28d3e7\" (UID: \"1bb88b2a-1562-42d2-b39d-745d4c28d3e7\") " Apr 16 18:37:18.400341 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:18.400161 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1bb88b2a-1562-42d2-b39d-745d4c28d3e7-kserve-provision-location\") pod \"1bb88b2a-1562-42d2-b39d-745d4c28d3e7\" (UID: \"1bb88b2a-1562-42d2-b39d-745d4c28d3e7\") " Apr 16 18:37:18.400341 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:18.400203 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1bb88b2a-1562-42d2-b39d-745d4c28d3e7-dshm\") pod \"1bb88b2a-1562-42d2-b39d-745d4c28d3e7\" (UID: \"1bb88b2a-1562-42d2-b39d-745d4c28d3e7\") " Apr 16 18:37:18.400341 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:18.400247 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/1bb88b2a-1562-42d2-b39d-745d4c28d3e7-model-cache\") pod \"1bb88b2a-1562-42d2-b39d-745d4c28d3e7\" (UID: \"1bb88b2a-1562-42d2-b39d-745d4c28d3e7\") " Apr 16 18:37:18.400341 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:18.400298 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1bb88b2a-1562-42d2-b39d-745d4c28d3e7-home\") pod \"1bb88b2a-1562-42d2-b39d-745d4c28d3e7\" (UID: \"1bb88b2a-1562-42d2-b39d-745d4c28d3e7\") " Apr 16 18:37:18.400549 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:18.400370 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1bb88b2a-1562-42d2-b39d-745d4c28d3e7-tls-certs\") pod \"1bb88b2a-1562-42d2-b39d-745d4c28d3e7\" (UID: \"1bb88b2a-1562-42d2-b39d-745d4c28d3e7\") " Apr 16 18:37:18.401353 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:18.401189 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bb88b2a-1562-42d2-b39d-745d4c28d3e7-home" (OuterVolumeSpecName: "home") pod "1bb88b2a-1562-42d2-b39d-745d4c28d3e7" (UID: "1bb88b2a-1562-42d2-b39d-745d4c28d3e7"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:37:18.401353 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:18.401265 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bb88b2a-1562-42d2-b39d-745d4c28d3e7-model-cache" (OuterVolumeSpecName: "model-cache") pod "1bb88b2a-1562-42d2-b39d-745d4c28d3e7" (UID: "1bb88b2a-1562-42d2-b39d-745d4c28d3e7"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:37:18.402872 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:18.402797 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bb88b2a-1562-42d2-b39d-745d4c28d3e7-kube-api-access-8w7k2" (OuterVolumeSpecName: "kube-api-access-8w7k2") pod "1bb88b2a-1562-42d2-b39d-745d4c28d3e7" (UID: "1bb88b2a-1562-42d2-b39d-745d4c28d3e7"). InnerVolumeSpecName "kube-api-access-8w7k2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:37:18.403002 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:18.402894 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bb88b2a-1562-42d2-b39d-745d4c28d3e7-dshm" (OuterVolumeSpecName: "dshm") pod "1bb88b2a-1562-42d2-b39d-745d4c28d3e7" (UID: "1bb88b2a-1562-42d2-b39d-745d4c28d3e7"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:37:18.403385 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:18.403349 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bb88b2a-1562-42d2-b39d-745d4c28d3e7-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "1bb88b2a-1562-42d2-b39d-745d4c28d3e7" (UID: "1bb88b2a-1562-42d2-b39d-745d4c28d3e7"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:37:18.454405 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:18.454311 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bb88b2a-1562-42d2-b39d-745d4c28d3e7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1bb88b2a-1562-42d2-b39d-745d4c28d3e7" (UID: "1bb88b2a-1562-42d2-b39d-745d4c28d3e7"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:37:18.501638 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:18.501599 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8w7k2\" (UniqueName: \"kubernetes.io/projected/1bb88b2a-1562-42d2-b39d-745d4c28d3e7-kube-api-access-8w7k2\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:37:18.501638 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:18.501635 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1bb88b2a-1562-42d2-b39d-745d4c28d3e7-kserve-provision-location\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:37:18.501891 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:18.501650 2574 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1bb88b2a-1562-42d2-b39d-745d4c28d3e7-dshm\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:37:18.501891 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:18.501663 2574 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1bb88b2a-1562-42d2-b39d-745d4c28d3e7-model-cache\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:37:18.501891 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:18.501675 2574 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1bb88b2a-1562-42d2-b39d-745d4c28d3e7-home\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:37:18.501891 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:18.501687 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1bb88b2a-1562-42d2-b39d-745d4c28d3e7-tls-certs\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:37:18.524744 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:18.524717 2574 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj_1bb88b2a-1562-42d2-b39d-745d4c28d3e7/main/0.log" Apr 16 18:37:18.525108 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:18.525081 2574 generic.go:358] "Generic (PLEG): container finished" podID="1bb88b2a-1562-42d2-b39d-745d4c28d3e7" containerID="1a4721991dbdc1ec47b431abc0478c5dd5ceaf18e492f3f222a436bf045a084a" exitCode=137 Apr 16 18:37:18.525178 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:18.525158 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" Apr 16 18:37:18.525222 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:18.525164 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" event={"ID":"1bb88b2a-1562-42d2-b39d-745d4c28d3e7","Type":"ContainerDied","Data":"1a4721991dbdc1ec47b431abc0478c5dd5ceaf18e492f3f222a436bf045a084a"} Apr 16 18:37:18.525222 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:18.525211 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj" event={"ID":"1bb88b2a-1562-42d2-b39d-745d4c28d3e7","Type":"ContainerDied","Data":"fd73aa61fab623dabab2614c61ff201e1c0818df74dca784cb92e9de231623df"} Apr 16 18:37:18.525307 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:18.525236 2574 scope.go:117] "RemoveContainer" containerID="1a4721991dbdc1ec47b431abc0478c5dd5ceaf18e492f3f222a436bf045a084a" Apr 16 18:37:18.553090 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:18.553056 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj"] Apr 16 18:37:18.553479 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:18.553464 2574 scope.go:117] "RemoveContainer" 
containerID="90b33f28eb144ee53485a75bef521be00f9542b0bde334517ae49c7d3ef06abc" Apr 16 18:37:18.558388 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:18.558358 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7fcccd6c8drpbtj"] Apr 16 18:37:18.613936 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:18.613912 2574 scope.go:117] "RemoveContainer" containerID="1a4721991dbdc1ec47b431abc0478c5dd5ceaf18e492f3f222a436bf045a084a" Apr 16 18:37:18.614304 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:37:18.614281 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a4721991dbdc1ec47b431abc0478c5dd5ceaf18e492f3f222a436bf045a084a\": container with ID starting with 1a4721991dbdc1ec47b431abc0478c5dd5ceaf18e492f3f222a436bf045a084a not found: ID does not exist" containerID="1a4721991dbdc1ec47b431abc0478c5dd5ceaf18e492f3f222a436bf045a084a" Apr 16 18:37:18.614402 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:18.614313 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a4721991dbdc1ec47b431abc0478c5dd5ceaf18e492f3f222a436bf045a084a"} err="failed to get container status \"1a4721991dbdc1ec47b431abc0478c5dd5ceaf18e492f3f222a436bf045a084a\": rpc error: code = NotFound desc = could not find container \"1a4721991dbdc1ec47b431abc0478c5dd5ceaf18e492f3f222a436bf045a084a\": container with ID starting with 1a4721991dbdc1ec47b431abc0478c5dd5ceaf18e492f3f222a436bf045a084a not found: ID does not exist" Apr 16 18:37:18.614402 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:18.614335 2574 scope.go:117] "RemoveContainer" containerID="90b33f28eb144ee53485a75bef521be00f9542b0bde334517ae49c7d3ef06abc" Apr 16 18:37:18.614619 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:37:18.614603 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"90b33f28eb144ee53485a75bef521be00f9542b0bde334517ae49c7d3ef06abc\": container with ID starting with 90b33f28eb144ee53485a75bef521be00f9542b0bde334517ae49c7d3ef06abc not found: ID does not exist" containerID="90b33f28eb144ee53485a75bef521be00f9542b0bde334517ae49c7d3ef06abc" Apr 16 18:37:18.614659 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:18.614623 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90b33f28eb144ee53485a75bef521be00f9542b0bde334517ae49c7d3ef06abc"} err="failed to get container status \"90b33f28eb144ee53485a75bef521be00f9542b0bde334517ae49c7d3ef06abc\": rpc error: code = NotFound desc = could not find container \"90b33f28eb144ee53485a75bef521be00f9542b0bde334517ae49c7d3ef06abc\": container with ID starting with 90b33f28eb144ee53485a75bef521be00f9542b0bde334517ae49c7d3ef06abc not found: ID does not exist" Apr 16 18:37:18.938415 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:18.938380 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bb88b2a-1562-42d2-b39d-745d4c28d3e7" path="/var/lib/kubelet/pods/1bb88b2a-1562-42d2-b39d-745d4c28d3e7/volumes" Apr 16 18:37:27.911893 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:27.911841 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="5e421425-027c-465c-a1f0-e16d1f6f7266" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused" Apr 16 18:37:31.342258 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.342221 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx"] Apr 16 18:37:31.342658 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.342565 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1bb88b2a-1562-42d2-b39d-745d4c28d3e7" 
containerName="storage-initializer" Apr 16 18:37:31.342658 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.342576 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bb88b2a-1562-42d2-b39d-745d4c28d3e7" containerName="storage-initializer" Apr 16 18:37:31.342658 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.342590 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1bb88b2a-1562-42d2-b39d-745d4c28d3e7" containerName="main" Apr 16 18:37:31.342658 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.342596 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bb88b2a-1562-42d2-b39d-745d4c28d3e7" containerName="main" Apr 16 18:37:31.342658 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.342657 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="1bb88b2a-1562-42d2-b39d-745d4c28d3e7" containerName="main" Apr 16 18:37:31.345765 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.345746 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" Apr 16 18:37:31.348396 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.348369 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 16 18:37:31.348553 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.348369 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-dockercfg-zlvzh\"" Apr 16 18:37:31.353135 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.353105 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg"] Apr 16 18:37:31.357301 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.357269 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" Apr 16 18:37:31.360814 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.360785 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx"] Apr 16 18:37:31.373243 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.373208 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg"] Apr 16 18:37:31.422744 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.422700 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg\" (UID: \"b5b23a33-a9f2-4d3f-9859-1df515f3b2d2\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" Apr 16 18:37:31.422744 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.422748 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr99x\" (UniqueName: \"kubernetes.io/projected/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2-kube-api-access-mr99x\") pod \"custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg\" (UID: \"b5b23a33-a9f2-4d3f-9859-1df515f3b2d2\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" Apr 16 18:37:31.423006 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.422772 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/49f3223a-0126-4965-9f3b-99fee62af21a-model-cache\") pod \"custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx\" (UID: \"49f3223a-0126-4965-9f3b-99fee62af21a\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" Apr 16 18:37:31.423006 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.422870 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzxld\" (UniqueName: \"kubernetes.io/projected/49f3223a-0126-4965-9f3b-99fee62af21a-kube-api-access-xzxld\") pod \"custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx\" (UID: \"49f3223a-0126-4965-9f3b-99fee62af21a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" Apr 16 18:37:31.423006 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.422916 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg\" (UID: \"b5b23a33-a9f2-4d3f-9859-1df515f3b2d2\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" Apr 16 18:37:31.423006 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.422937 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/49f3223a-0126-4965-9f3b-99fee62af21a-home\") pod \"custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx\" (UID: \"49f3223a-0126-4965-9f3b-99fee62af21a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" Apr 16 18:37:31.423006 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.422968 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/49f3223a-0126-4965-9f3b-99fee62af21a-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx\" (UID: \"49f3223a-0126-4965-9f3b-99fee62af21a\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" Apr 16 18:37:31.423191 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.423029 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/49f3223a-0126-4965-9f3b-99fee62af21a-dshm\") pod \"custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx\" (UID: \"49f3223a-0126-4965-9f3b-99fee62af21a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" Apr 16 18:37:31.423191 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.423052 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/49f3223a-0126-4965-9f3b-99fee62af21a-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx\" (UID: \"49f3223a-0126-4965-9f3b-99fee62af21a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" Apr 16 18:37:31.423191 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.423076 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg\" (UID: \"b5b23a33-a9f2-4d3f-9859-1df515f3b2d2\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" Apr 16 18:37:31.423191 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.423093 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg\" (UID: \"b5b23a33-a9f2-4d3f-9859-1df515f3b2d2\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" Apr 16 
18:37:31.423191 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.423112 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg\" (UID: \"b5b23a33-a9f2-4d3f-9859-1df515f3b2d2\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" Apr 16 18:37:31.524001 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.523956 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/49f3223a-0126-4965-9f3b-99fee62af21a-dshm\") pod \"custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx\" (UID: \"49f3223a-0126-4965-9f3b-99fee62af21a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" Apr 16 18:37:31.524001 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.524004 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/49f3223a-0126-4965-9f3b-99fee62af21a-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx\" (UID: \"49f3223a-0126-4965-9f3b-99fee62af21a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" Apr 16 18:37:31.524266 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.524038 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg\" (UID: \"b5b23a33-a9f2-4d3f-9859-1df515f3b2d2\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" Apr 16 18:37:31.524266 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.524063 2574 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg\" (UID: \"b5b23a33-a9f2-4d3f-9859-1df515f3b2d2\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" Apr 16 18:37:31.524266 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.524089 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg\" (UID: \"b5b23a33-a9f2-4d3f-9859-1df515f3b2d2\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" Apr 16 18:37:31.524266 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.524114 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg\" (UID: \"b5b23a33-a9f2-4d3f-9859-1df515f3b2d2\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" Apr 16 18:37:31.524266 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.524248 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mr99x\" (UniqueName: \"kubernetes.io/projected/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2-kube-api-access-mr99x\") pod \"custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg\" (UID: \"b5b23a33-a9f2-4d3f-9859-1df515f3b2d2\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" Apr 16 18:37:31.524746 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.524303 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/49f3223a-0126-4965-9f3b-99fee62af21a-model-cache\") 
pod \"custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx\" (UID: \"49f3223a-0126-4965-9f3b-99fee62af21a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" Apr 16 18:37:31.524746 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.524361 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xzxld\" (UniqueName: \"kubernetes.io/projected/49f3223a-0126-4965-9f3b-99fee62af21a-kube-api-access-xzxld\") pod \"custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx\" (UID: \"49f3223a-0126-4965-9f3b-99fee62af21a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" Apr 16 18:37:31.524746 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.524419 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg\" (UID: \"b5b23a33-a9f2-4d3f-9859-1df515f3b2d2\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" Apr 16 18:37:31.524746 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.524452 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/49f3223a-0126-4965-9f3b-99fee62af21a-home\") pod \"custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx\" (UID: \"49f3223a-0126-4965-9f3b-99fee62af21a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" Apr 16 18:37:31.524746 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.524502 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/49f3223a-0126-4965-9f3b-99fee62af21a-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx\" (UID: 
\"49f3223a-0126-4965-9f3b-99fee62af21a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" Apr 16 18:37:31.524746 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.524511 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg\" (UID: \"b5b23a33-a9f2-4d3f-9859-1df515f3b2d2\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" Apr 16 18:37:31.525101 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.524779 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/49f3223a-0126-4965-9f3b-99fee62af21a-model-cache\") pod \"custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx\" (UID: \"49f3223a-0126-4965-9f3b-99fee62af21a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" Apr 16 18:37:31.525101 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.524861 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/49f3223a-0126-4965-9f3b-99fee62af21a-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx\" (UID: \"49f3223a-0126-4965-9f3b-99fee62af21a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" Apr 16 18:37:31.525101 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.524871 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg\" (UID: \"b5b23a33-a9f2-4d3f-9859-1df515f3b2d2\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" Apr 
16 18:37:31.525101 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.524452 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg\" (UID: \"b5b23a33-a9f2-4d3f-9859-1df515f3b2d2\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" Apr 16 18:37:31.525101 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.525027 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/49f3223a-0126-4965-9f3b-99fee62af21a-home\") pod \"custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx\" (UID: \"49f3223a-0126-4965-9f3b-99fee62af21a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" Apr 16 18:37:31.527373 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.527328 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/49f3223a-0126-4965-9f3b-99fee62af21a-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx\" (UID: \"49f3223a-0126-4965-9f3b-99fee62af21a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" Apr 16 18:37:31.527496 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.527427 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg\" (UID: \"b5b23a33-a9f2-4d3f-9859-1df515f3b2d2\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" Apr 16 18:37:31.527623 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.527605 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg\" (UID: \"b5b23a33-a9f2-4d3f-9859-1df515f3b2d2\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" Apr 16 18:37:31.527873 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.527816 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/49f3223a-0126-4965-9f3b-99fee62af21a-dshm\") pod \"custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx\" (UID: \"49f3223a-0126-4965-9f3b-99fee62af21a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" Apr 16 18:37:31.534906 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.534858 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr99x\" (UniqueName: \"kubernetes.io/projected/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2-kube-api-access-mr99x\") pod \"custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg\" (UID: \"b5b23a33-a9f2-4d3f-9859-1df515f3b2d2\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" Apr 16 18:37:31.535027 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.534939 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzxld\" (UniqueName: \"kubernetes.io/projected/49f3223a-0126-4965-9f3b-99fee62af21a-kube-api-access-xzxld\") pod \"custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx\" (UID: \"49f3223a-0126-4965-9f3b-99fee62af21a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" Apr 16 18:37:31.657031 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.656991 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" Apr 16 18:37:31.671139 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.671069 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" Apr 16 18:37:31.830486 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.830454 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx"] Apr 16 18:37:31.833644 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:37:31.833608 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49f3223a_0126_4965_9f3b_99fee62af21a.slice/crio-8b1f228860e5a006a77f42117b4dfdf416aaa6519d353a64a9dbfb506315a905 WatchSource:0}: Error finding container 8b1f228860e5a006a77f42117b4dfdf416aaa6519d353a64a9dbfb506315a905: Status 404 returned error can't find the container with id 8b1f228860e5a006a77f42117b4dfdf416aaa6519d353a64a9dbfb506315a905 Apr 16 18:37:31.850360 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:31.850333 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg"] Apr 16 18:37:31.854214 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:37:31.854179 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5b23a33_a9f2_4d3f_9859_1df515f3b2d2.slice/crio-e2cc0d18ac04bf6cfd7450dead1f49de43c25330467c5b901ae4bd02c21f8b70 WatchSource:0}: Error finding container e2cc0d18ac04bf6cfd7450dead1f49de43c25330467c5b901ae4bd02c21f8b70: Status 404 returned error can't find the container with id e2cc0d18ac04bf6cfd7450dead1f49de43c25330467c5b901ae4bd02c21f8b70 Apr 16 18:37:32.583887 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:32.583849 2574 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" event={"ID":"49f3223a-0126-4965-9f3b-99fee62af21a","Type":"ContainerStarted","Data":"e11ec0c88c7c886af7683b906bd61c5231b5389856face567b721258ce1699f9"} Apr 16 18:37:32.583887 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:32.583891 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" event={"ID":"49f3223a-0126-4965-9f3b-99fee62af21a","Type":"ContainerStarted","Data":"8b1f228860e5a006a77f42117b4dfdf416aaa6519d353a64a9dbfb506315a905"} Apr 16 18:37:32.584555 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:32.583971 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" Apr 16 18:37:32.585379 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:32.585351 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" event={"ID":"b5b23a33-a9f2-4d3f-9859-1df515f3b2d2","Type":"ContainerStarted","Data":"013aa86c715dff86b6d800252fba0362e31daf2d4e9af8782a876c896b57cc62"} Apr 16 18:37:32.585511 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:32.585386 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" event={"ID":"b5b23a33-a9f2-4d3f-9859-1df515f3b2d2","Type":"ContainerStarted","Data":"e2cc0d18ac04bf6cfd7450dead1f49de43c25330467c5b901ae4bd02c21f8b70"} Apr 16 18:37:33.590619 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:33.590580 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" event={"ID":"49f3223a-0126-4965-9f3b-99fee62af21a","Type":"ContainerStarted","Data":"76957d1fe13a0c6f5b1a9268abdb9c1acacfc9eb7362d1725ec0e9b2316ff776"} 
Apr 16 18:37:34.241463 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:34.240797 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7"] Apr 16 18:37:34.241463 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:34.241254 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" podUID="158967f4-4093-40cd-9f7c-a987d16c8a1c" containerName="main" containerID="cri-o://c5a2f2100cc36f64e0866a2bc847a62d15196ab003c8c3e09a361e0b46c0a85c" gracePeriod=30 Apr 16 18:37:34.243732 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:34.243677 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts"] Apr 16 18:37:34.244205 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:34.244161 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" podUID="1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" containerName="main" containerID="cri-o://9f6a0361d7415593a509bdeb35d418dcda30d0be576d0a29a41339e51583f3af" gracePeriod=30 Apr 16 18:37:37.610319 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:37.610280 2574 generic.go:358] "Generic (PLEG): container finished" podID="49f3223a-0126-4965-9f3b-99fee62af21a" containerID="76957d1fe13a0c6f5b1a9268abdb9c1acacfc9eb7362d1725ec0e9b2316ff776" exitCode=0 Apr 16 18:37:37.610841 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:37.610354 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" event={"ID":"49f3223a-0126-4965-9f3b-99fee62af21a","Type":"ContainerDied","Data":"76957d1fe13a0c6f5b1a9268abdb9c1acacfc9eb7362d1725ec0e9b2316ff776"} Apr 16 18:37:37.611776 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:37.611747 
2574 generic.go:358] "Generic (PLEG): container finished" podID="b5b23a33-a9f2-4d3f-9859-1df515f3b2d2" containerID="013aa86c715dff86b6d800252fba0362e31daf2d4e9af8782a876c896b57cc62" exitCode=0 Apr 16 18:37:37.611873 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:37.611807 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" event={"ID":"b5b23a33-a9f2-4d3f-9859-1df515f3b2d2","Type":"ContainerDied","Data":"013aa86c715dff86b6d800252fba0362e31daf2d4e9af8782a876c896b57cc62"} Apr 16 18:37:37.911983 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:37.911925 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:37:37.912298 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:37.912267 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="5e421425-027c-465c-a1f0-e16d1f6f7266" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused" Apr 16 18:37:38.618566 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:38.618528 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" event={"ID":"49f3223a-0126-4965-9f3b-99fee62af21a","Type":"ContainerStarted","Data":"4d8746db5716e20a4f8820a5ed40c42da3ea91a7269c4f7ef4e560d7fc7ad424"} Apr 16 18:37:38.620489 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:38.620458 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" event={"ID":"b5b23a33-a9f2-4d3f-9859-1df515f3b2d2","Type":"ContainerStarted","Data":"16c1fa10d1293e8c298f3955afa7c4bd573c51c390895d7ff340d07324d38df1"} Apr 16 18:37:38.644681 ip-10-0-142-43 
kubenswrapper[2574]: I0416 18:37:38.644618 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" podStartSLOduration=7.644594698 podStartE2EDuration="7.644594698s" podCreationTimestamp="2026-04-16 18:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:37:38.641592613 +0000 UTC m=+1678.296827503" watchObservedRunningTime="2026-04-16 18:37:38.644594698 +0000 UTC m=+1678.299829551" Apr 16 18:37:38.664568 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:38.664501 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" podStartSLOduration=7.664482268 podStartE2EDuration="7.664482268s" podCreationTimestamp="2026-04-16 18:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:37:38.662471855 +0000 UTC m=+1678.317706708" watchObservedRunningTime="2026-04-16 18:37:38.664482268 +0000 UTC m=+1678.319717122" Apr 16 18:37:41.657719 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:41.657674 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" Apr 16 18:37:41.658148 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:41.657729 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" Apr 16 18:37:41.659325 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:41.659290 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" podUID="49f3223a-0126-4965-9f3b-99fee62af21a" containerName="main" probeResult="failure" output="Get 
\"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 16 18:37:41.671250 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:41.671206 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" Apr 16 18:37:41.671250 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:41.671252 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" Apr 16 18:37:41.672946 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:41.672910 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" podUID="b5b23a33-a9f2-4d3f-9859-1df515f3b2d2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 16 18:37:41.673671 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:41.673650 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" Apr 16 18:37:47.912652 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:47.912595 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="5e421425-027c-465c-a1f0-e16d1f6f7266" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused" Apr 16 18:37:51.658617 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:51.658389 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" podUID="49f3223a-0126-4965-9f3b-99fee62af21a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 
10.133.0.40:8001: connect: connection refused" Apr 16 18:37:51.672317 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:51.672269 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" podUID="b5b23a33-a9f2-4d3f-9859-1df515f3b2d2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 16 18:37:57.911670 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:37:57.911612 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="5e421425-027c-465c-a1f0-e16d1f6f7266" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused" Apr 16 18:38:01.657892 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:01.657807 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" podUID="49f3223a-0126-4965-9f3b-99fee62af21a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 16 18:38:01.671852 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:01.671788 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" podUID="b5b23a33-a9f2-4d3f-9859-1df515f3b2d2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 16 18:38:04.244694 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.244644 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" podUID="1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" 
containerName="llm-d-routing-sidecar" containerID="cri-o://218ba5ee943a80826574dd3b086d8102c477eac9e8fe213b1675ad681ebdd165" gracePeriod=2 Apr 16 18:38:04.693275 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.693244 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts_1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4/main/0.log" Apr 16 18:38:04.694107 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.694079 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" Apr 16 18:38:04.698518 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.698494 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" Apr 16 18:38:04.735052 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.735024 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts_1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4/main/0.log" Apr 16 18:38:04.735840 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.735790 2574 generic.go:358] "Generic (PLEG): container finished" podID="1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" containerID="9f6a0361d7415593a509bdeb35d418dcda30d0be576d0a29a41339e51583f3af" exitCode=137 Apr 16 18:38:04.735840 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.735821 2574 generic.go:358] "Generic (PLEG): container finished" podID="1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" containerID="218ba5ee943a80826574dd3b086d8102c477eac9e8fe213b1675ad681ebdd165" exitCode=0 Apr 16 18:38:04.736048 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.735907 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" Apr 16 18:38:04.736048 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.736007 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" event={"ID":"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4","Type":"ContainerDied","Data":"9f6a0361d7415593a509bdeb35d418dcda30d0be576d0a29a41339e51583f3af"} Apr 16 18:38:04.736048 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.736040 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" event={"ID":"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4","Type":"ContainerDied","Data":"218ba5ee943a80826574dd3b086d8102c477eac9e8fe213b1675ad681ebdd165"} Apr 16 18:38:04.736211 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.736056 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts" event={"ID":"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4","Type":"ContainerDied","Data":"4ab86e442a3413c76524812c14f9e57d3dfb1ad7bb01ab0aa6435b4a70f661b4"} Apr 16 18:38:04.736211 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.736076 2574 scope.go:117] "RemoveContainer" containerID="9f6a0361d7415593a509bdeb35d418dcda30d0be576d0a29a41339e51583f3af" Apr 16 18:38:04.738296 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.738228 2574 generic.go:358] "Generic (PLEG): container finished" podID="158967f4-4093-40cd-9f7c-a987d16c8a1c" containerID="c5a2f2100cc36f64e0866a2bc847a62d15196ab003c8c3e09a361e0b46c0a85c" exitCode=137 Apr 16 18:38:04.738296 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.738288 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" 
event={"ID":"158967f4-4093-40cd-9f7c-a987d16c8a1c","Type":"ContainerDied","Data":"c5a2f2100cc36f64e0866a2bc847a62d15196ab003c8c3e09a361e0b46c0a85c"} Apr 16 18:38:04.738649 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.738316 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" event={"ID":"158967f4-4093-40cd-9f7c-a987d16c8a1c","Type":"ContainerDied","Data":"3dd61b8021acaf5ccea7f5f1fa017cf4594b4a3abd7f69f4b8bd43bb2f62c7a5"} Apr 16 18:38:04.738649 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.738321 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7" Apr 16 18:38:04.762767 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.761491 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/158967f4-4093-40cd-9f7c-a987d16c8a1c-home\") pod \"158967f4-4093-40cd-9f7c-a987d16c8a1c\" (UID: \"158967f4-4093-40cd-9f7c-a987d16c8a1c\") " Apr 16 18:38:04.762767 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.761546 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4-tls-certs\") pod \"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4\" (UID: \"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4\") " Apr 16 18:38:04.762767 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.761597 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4-kserve-provision-location\") pod \"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4\" (UID: \"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4\") " Apr 16 18:38:04.762767 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.761647 2574 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/158967f4-4093-40cd-9f7c-a987d16c8a1c-dshm\") pod \"158967f4-4093-40cd-9f7c-a987d16c8a1c\" (UID: \"158967f4-4093-40cd-9f7c-a987d16c8a1c\") " Apr 16 18:38:04.762767 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.761673 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/158967f4-4093-40cd-9f7c-a987d16c8a1c-kserve-provision-location\") pod \"158967f4-4093-40cd-9f7c-a987d16c8a1c\" (UID: \"158967f4-4093-40cd-9f7c-a987d16c8a1c\") " Apr 16 18:38:04.762767 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.761698 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/158967f4-4093-40cd-9f7c-a987d16c8a1c-model-cache\") pod \"158967f4-4093-40cd-9f7c-a987d16c8a1c\" (UID: \"158967f4-4093-40cd-9f7c-a987d16c8a1c\") " Apr 16 18:38:04.762767 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.761758 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzj4g\" (UniqueName: \"kubernetes.io/projected/158967f4-4093-40cd-9f7c-a987d16c8a1c-kube-api-access-qzj4g\") pod \"158967f4-4093-40cd-9f7c-a987d16c8a1c\" (UID: \"158967f4-4093-40cd-9f7c-a987d16c8a1c\") " Apr 16 18:38:04.762767 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.761782 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5rqq\" (UniqueName: \"kubernetes.io/projected/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4-kube-api-access-t5rqq\") pod \"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4\" (UID: \"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4\") " Apr 16 18:38:04.762767 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.761807 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4-home\") pod \"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4\" (UID: \"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4\") " Apr 16 18:38:04.762767 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.761882 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/158967f4-4093-40cd-9f7c-a987d16c8a1c-tls-certs\") pod \"158967f4-4093-40cd-9f7c-a987d16c8a1c\" (UID: \"158967f4-4093-40cd-9f7c-a987d16c8a1c\") " Apr 16 18:38:04.762767 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.761925 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4-dshm\") pod \"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4\" (UID: \"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4\") " Apr 16 18:38:04.762767 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.761949 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4-model-cache\") pod \"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4\" (UID: \"1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4\") " Apr 16 18:38:04.762767 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.762009 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/158967f4-4093-40cd-9f7c-a987d16c8a1c-home" (OuterVolumeSpecName: "home") pod "158967f4-4093-40cd-9f7c-a987d16c8a1c" (UID: "158967f4-4093-40cd-9f7c-a987d16c8a1c"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:38:04.762767 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.762229 2574 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/158967f4-4093-40cd-9f7c-a987d16c8a1c-home\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:38:04.762767 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.762431 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/158967f4-4093-40cd-9f7c-a987d16c8a1c-model-cache" (OuterVolumeSpecName: "model-cache") pod "158967f4-4093-40cd-9f7c-a987d16c8a1c" (UID: "158967f4-4093-40cd-9f7c-a987d16c8a1c"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:38:04.764208 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.763924 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4-home" (OuterVolumeSpecName: "home") pod "1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" (UID: "1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:38:04.764556 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.764417 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4-model-cache" (OuterVolumeSpecName: "model-cache") pod "1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" (UID: "1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:38:04.767636 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.767242 2574 scope.go:117] "RemoveContainer" containerID="0f80c59cf229fdd78bb2261d422a8d5083adbaf4fd9607daae66521e4642e0b3" Apr 16 18:38:04.770030 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.769815 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/158967f4-4093-40cd-9f7c-a987d16c8a1c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "158967f4-4093-40cd-9f7c-a987d16c8a1c" (UID: "158967f4-4093-40cd-9f7c-a987d16c8a1c"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:38:04.770195 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.770168 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" (UID: "1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:38:04.770722 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.770679 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/158967f4-4093-40cd-9f7c-a987d16c8a1c-kube-api-access-qzj4g" (OuterVolumeSpecName: "kube-api-access-qzj4g") pod "158967f4-4093-40cd-9f7c-a987d16c8a1c" (UID: "158967f4-4093-40cd-9f7c-a987d16c8a1c"). InnerVolumeSpecName "kube-api-access-qzj4g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:38:04.771108 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.771065 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/158967f4-4093-40cd-9f7c-a987d16c8a1c-dshm" (OuterVolumeSpecName: "dshm") pod "158967f4-4093-40cd-9f7c-a987d16c8a1c" (UID: "158967f4-4093-40cd-9f7c-a987d16c8a1c"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:38:04.772243 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.772201 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4-dshm" (OuterVolumeSpecName: "dshm") pod "1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" (UID: "1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:38:04.776335 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.776296 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4-kube-api-access-t5rqq" (OuterVolumeSpecName: "kube-api-access-t5rqq") pod "1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" (UID: "1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4"). InnerVolumeSpecName "kube-api-access-t5rqq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:38:04.850742 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.850705 2574 scope.go:117] "RemoveContainer" containerID="218ba5ee943a80826574dd3b086d8102c477eac9e8fe213b1675ad681ebdd165" Apr 16 18:38:04.853231 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.853188 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" (UID: "1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:38:04.859935 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.859886 2574 scope.go:117] "RemoveContainer" containerID="9f6a0361d7415593a509bdeb35d418dcda30d0be576d0a29a41339e51583f3af" Apr 16 18:38:04.860332 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:38:04.860299 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f6a0361d7415593a509bdeb35d418dcda30d0be576d0a29a41339e51583f3af\": container with ID starting with 9f6a0361d7415593a509bdeb35d418dcda30d0be576d0a29a41339e51583f3af not found: ID does not exist" containerID="9f6a0361d7415593a509bdeb35d418dcda30d0be576d0a29a41339e51583f3af" Apr 16 18:38:04.860410 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.860347 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f6a0361d7415593a509bdeb35d418dcda30d0be576d0a29a41339e51583f3af"} err="failed to get container status \"9f6a0361d7415593a509bdeb35d418dcda30d0be576d0a29a41339e51583f3af\": rpc error: code = NotFound desc = could not find container \"9f6a0361d7415593a509bdeb35d418dcda30d0be576d0a29a41339e51583f3af\": container with ID starting with 9f6a0361d7415593a509bdeb35d418dcda30d0be576d0a29a41339e51583f3af not found: ID does not exist" Apr 16 18:38:04.860410 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.860392 2574 scope.go:117] "RemoveContainer" containerID="0f80c59cf229fdd78bb2261d422a8d5083adbaf4fd9607daae66521e4642e0b3" Apr 16 18:38:04.860743 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:38:04.860711 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f80c59cf229fdd78bb2261d422a8d5083adbaf4fd9607daae66521e4642e0b3\": container with ID starting with 0f80c59cf229fdd78bb2261d422a8d5083adbaf4fd9607daae66521e4642e0b3 not found: ID does not exist" 
containerID="0f80c59cf229fdd78bb2261d422a8d5083adbaf4fd9607daae66521e4642e0b3" Apr 16 18:38:04.861051 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.860753 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f80c59cf229fdd78bb2261d422a8d5083adbaf4fd9607daae66521e4642e0b3"} err="failed to get container status \"0f80c59cf229fdd78bb2261d422a8d5083adbaf4fd9607daae66521e4642e0b3\": rpc error: code = NotFound desc = could not find container \"0f80c59cf229fdd78bb2261d422a8d5083adbaf4fd9607daae66521e4642e0b3\": container with ID starting with 0f80c59cf229fdd78bb2261d422a8d5083adbaf4fd9607daae66521e4642e0b3 not found: ID does not exist" Apr 16 18:38:04.861051 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.860769 2574 scope.go:117] "RemoveContainer" containerID="218ba5ee943a80826574dd3b086d8102c477eac9e8fe213b1675ad681ebdd165" Apr 16 18:38:04.861205 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:38:04.861055 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"218ba5ee943a80826574dd3b086d8102c477eac9e8fe213b1675ad681ebdd165\": container with ID starting with 218ba5ee943a80826574dd3b086d8102c477eac9e8fe213b1675ad681ebdd165 not found: ID does not exist" containerID="218ba5ee943a80826574dd3b086d8102c477eac9e8fe213b1675ad681ebdd165" Apr 16 18:38:04.861205 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.861085 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"218ba5ee943a80826574dd3b086d8102c477eac9e8fe213b1675ad681ebdd165"} err="failed to get container status \"218ba5ee943a80826574dd3b086d8102c477eac9e8fe213b1675ad681ebdd165\": rpc error: code = NotFound desc = could not find container \"218ba5ee943a80826574dd3b086d8102c477eac9e8fe213b1675ad681ebdd165\": container with ID starting with 218ba5ee943a80826574dd3b086d8102c477eac9e8fe213b1675ad681ebdd165 not found: ID does not exist" Apr 16 
18:38:04.861205 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.861104 2574 scope.go:117] "RemoveContainer" containerID="9f6a0361d7415593a509bdeb35d418dcda30d0be576d0a29a41339e51583f3af" Apr 16 18:38:04.861382 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.861365 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f6a0361d7415593a509bdeb35d418dcda30d0be576d0a29a41339e51583f3af"} err="failed to get container status \"9f6a0361d7415593a509bdeb35d418dcda30d0be576d0a29a41339e51583f3af\": rpc error: code = NotFound desc = could not find container \"9f6a0361d7415593a509bdeb35d418dcda30d0be576d0a29a41339e51583f3af\": container with ID starting with 9f6a0361d7415593a509bdeb35d418dcda30d0be576d0a29a41339e51583f3af not found: ID does not exist" Apr 16 18:38:04.861442 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.861386 2574 scope.go:117] "RemoveContainer" containerID="0f80c59cf229fdd78bb2261d422a8d5083adbaf4fd9607daae66521e4642e0b3" Apr 16 18:38:04.861641 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.861611 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f80c59cf229fdd78bb2261d422a8d5083adbaf4fd9607daae66521e4642e0b3"} err="failed to get container status \"0f80c59cf229fdd78bb2261d422a8d5083adbaf4fd9607daae66521e4642e0b3\": rpc error: code = NotFound desc = could not find container \"0f80c59cf229fdd78bb2261d422a8d5083adbaf4fd9607daae66521e4642e0b3\": container with ID starting with 0f80c59cf229fdd78bb2261d422a8d5083adbaf4fd9607daae66521e4642e0b3 not found: ID does not exist" Apr 16 18:38:04.861641 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.861640 2574 scope.go:117] "RemoveContainer" containerID="218ba5ee943a80826574dd3b086d8102c477eac9e8fe213b1675ad681ebdd165" Apr 16 18:38:04.861900 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.861882 2574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"218ba5ee943a80826574dd3b086d8102c477eac9e8fe213b1675ad681ebdd165"} err="failed to get container status \"218ba5ee943a80826574dd3b086d8102c477eac9e8fe213b1675ad681ebdd165\": rpc error: code = NotFound desc = could not find container \"218ba5ee943a80826574dd3b086d8102c477eac9e8fe213b1675ad681ebdd165\": container with ID starting with 218ba5ee943a80826574dd3b086d8102c477eac9e8fe213b1675ad681ebdd165 not found: ID does not exist" Apr 16 18:38:04.861900 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.861900 2574 scope.go:117] "RemoveContainer" containerID="c5a2f2100cc36f64e0866a2bc847a62d15196ab003c8c3e09a361e0b46c0a85c" Apr 16 18:38:04.862740 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.862717 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/158967f4-4093-40cd-9f7c-a987d16c8a1c-tls-certs\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:38:04.863024 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.863001 2574 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4-dshm\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:38:04.863130 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.863028 2574 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4-model-cache\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:38:04.863130 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.863043 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4-tls-certs\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:38:04.863130 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.863057 2574 reconciler_common.go:299] "Volume detached for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4-kserve-provision-location\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:38:04.863130 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.863074 2574 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/158967f4-4093-40cd-9f7c-a987d16c8a1c-dshm\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:38:04.863130 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.863089 2574 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/158967f4-4093-40cd-9f7c-a987d16c8a1c-model-cache\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:38:04.863130 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.863103 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qzj4g\" (UniqueName: \"kubernetes.io/projected/158967f4-4093-40cd-9f7c-a987d16c8a1c-kube-api-access-qzj4g\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:38:04.863130 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.863117 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t5rqq\" (UniqueName: \"kubernetes.io/projected/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4-kube-api-access-t5rqq\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:38:04.863130 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.863131 2574 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4-home\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:38:04.870452 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.870405 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/158967f4-4093-40cd-9f7c-a987d16c8a1c-kserve-provision-location" (OuterVolumeSpecName: 
"kserve-provision-location") pod "158967f4-4093-40cd-9f7c-a987d16c8a1c" (UID: "158967f4-4093-40cd-9f7c-a987d16c8a1c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:38:04.890165 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.890130 2574 scope.go:117] "RemoveContainer" containerID="03508532ebee044d48ad510fd68429beda7b79af5574216ca772857a04e7f86c" Apr 16 18:38:04.964471 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.964300 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/158967f4-4093-40cd-9f7c-a987d16c8a1c-kserve-provision-location\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:38:04.969293 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.969267 2574 scope.go:117] "RemoveContainer" containerID="c5a2f2100cc36f64e0866a2bc847a62d15196ab003c8c3e09a361e0b46c0a85c" Apr 16 18:38:04.969660 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:38:04.969631 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5a2f2100cc36f64e0866a2bc847a62d15196ab003c8c3e09a361e0b46c0a85c\": container with ID starting with c5a2f2100cc36f64e0866a2bc847a62d15196ab003c8c3e09a361e0b46c0a85c not found: ID does not exist" containerID="c5a2f2100cc36f64e0866a2bc847a62d15196ab003c8c3e09a361e0b46c0a85c" Apr 16 18:38:04.969788 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.969669 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5a2f2100cc36f64e0866a2bc847a62d15196ab003c8c3e09a361e0b46c0a85c"} err="failed to get container status \"c5a2f2100cc36f64e0866a2bc847a62d15196ab003c8c3e09a361e0b46c0a85c\": rpc error: code = NotFound desc = could not find container \"c5a2f2100cc36f64e0866a2bc847a62d15196ab003c8c3e09a361e0b46c0a85c\": container with ID starting with 
c5a2f2100cc36f64e0866a2bc847a62d15196ab003c8c3e09a361e0b46c0a85c not found: ID does not exist" Apr 16 18:38:04.969788 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.969690 2574 scope.go:117] "RemoveContainer" containerID="03508532ebee044d48ad510fd68429beda7b79af5574216ca772857a04e7f86c" Apr 16 18:38:04.970130 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:38:04.970109 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03508532ebee044d48ad510fd68429beda7b79af5574216ca772857a04e7f86c\": container with ID starting with 03508532ebee044d48ad510fd68429beda7b79af5574216ca772857a04e7f86c not found: ID does not exist" containerID="03508532ebee044d48ad510fd68429beda7b79af5574216ca772857a04e7f86c" Apr 16 18:38:04.970197 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:04.970136 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03508532ebee044d48ad510fd68429beda7b79af5574216ca772857a04e7f86c"} err="failed to get container status \"03508532ebee044d48ad510fd68429beda7b79af5574216ca772857a04e7f86c\": rpc error: code = NotFound desc = could not find container \"03508532ebee044d48ad510fd68429beda7b79af5574216ca772857a04e7f86c\": container with ID starting with 03508532ebee044d48ad510fd68429beda7b79af5574216ca772857a04e7f86c not found: ID does not exist" Apr 16 18:38:04.996217 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:38:04.996152 2574 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod158967f4_4093_40cd_9f7c_a987d16c8a1c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fa1bfcf_592e_44ab_ac60_e9ee7850a8a4.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod158967f4_4093_40cd_9f7c_a987d16c8a1c.slice/crio-3dd61b8021acaf5ccea7f5f1fa017cf4594b4a3abd7f69f4b8bd43bb2f62c7a5\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fa1bfcf_592e_44ab_ac60_e9ee7850a8a4.slice/crio-4ab86e442a3413c76524812c14f9e57d3dfb1ad7bb01ab0aa6435b4a70f661b4\": RecentStats: unable to find data in memory cache]" Apr 16 18:38:05.074800 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:05.074761 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7"] Apr 16 18:38:05.081343 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:05.081308 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5ct9gf7"] Apr 16 18:38:05.106958 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:05.106865 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts"] Apr 16 18:38:05.113442 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:05.113398 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-67474c68dcsqvts"] Apr 16 18:38:06.939794 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:06.939742 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="158967f4-4093-40cd-9f7c-a987d16c8a1c" path="/var/lib/kubelet/pods/158967f4-4093-40cd-9f7c-a987d16c8a1c/volumes" Apr 16 18:38:06.940454 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:06.940426 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" path="/var/lib/kubelet/pods/1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4/volumes" Apr 16 18:38:07.912636 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:07.912587 2574 
prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="5e421425-027c-465c-a1f0-e16d1f6f7266" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused" Apr 16 18:38:11.658488 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:11.658432 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" podUID="49f3223a-0126-4965-9f3b-99fee62af21a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 16 18:38:11.672310 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:11.672252 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" podUID="b5b23a33-a9f2-4d3f-9859-1df515f3b2d2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 16 18:38:17.912112 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:17.912060 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="5e421425-027c-465c-a1f0-e16d1f6f7266" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused" Apr 16 18:38:21.658351 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:21.658304 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" podUID="49f3223a-0126-4965-9f3b-99fee62af21a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 16 18:38:21.671585 ip-10-0-142-43 
kubenswrapper[2574]: I0416 18:38:21.671537 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" podUID="b5b23a33-a9f2-4d3f-9859-1df515f3b2d2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 16 18:38:27.912321 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:27.912265 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="5e421425-027c-465c-a1f0-e16d1f6f7266" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused" Apr 16 18:38:31.657890 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:31.657816 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" podUID="49f3223a-0126-4965-9f3b-99fee62af21a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 16 18:38:31.671590 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:31.671539 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" podUID="b5b23a33-a9f2-4d3f-9859-1df515f3b2d2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 16 18:38:37.912144 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:37.912090 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="5e421425-027c-465c-a1f0-e16d1f6f7266" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: 
connection refused" Apr 16 18:38:41.657779 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:41.657724 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" podUID="49f3223a-0126-4965-9f3b-99fee62af21a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 16 18:38:41.671624 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:41.671578 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" podUID="b5b23a33-a9f2-4d3f-9859-1df515f3b2d2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 16 18:38:47.911714 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:47.911654 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="5e421425-027c-465c-a1f0-e16d1f6f7266" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused" Apr 16 18:38:51.658458 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:51.658409 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" podUID="49f3223a-0126-4965-9f3b-99fee62af21a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 16 18:38:51.671642 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:51.671603 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" podUID="b5b23a33-a9f2-4d3f-9859-1df515f3b2d2" containerName="main" probeResult="failure" output="Get 
\"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 16 18:38:57.912005 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:38:57.911952 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="5e421425-027c-465c-a1f0-e16d1f6f7266" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused" Apr 16 18:39:01.658515 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:01.658471 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" podUID="49f3223a-0126-4965-9f3b-99fee62af21a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 16 18:39:01.671664 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:01.671615 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" podUID="b5b23a33-a9f2-4d3f-9859-1df515f3b2d2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 16 18:39:07.912089 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:07.912040 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="5e421425-027c-465c-a1f0-e16d1f6f7266" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused" Apr 16 18:39:11.658182 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:11.658137 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" podUID="49f3223a-0126-4965-9f3b-99fee62af21a" 
containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 16 18:39:11.671663 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:11.671625 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" podUID="b5b23a33-a9f2-4d3f-9859-1df515f3b2d2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 16 18:39:17.912565 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:17.912515 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="5e421425-027c-465c-a1f0-e16d1f6f7266" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused" Apr 16 18:39:21.657849 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:21.657722 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" podUID="49f3223a-0126-4965-9f3b-99fee62af21a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 16 18:39:21.672209 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:21.672158 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" podUID="b5b23a33-a9f2-4d3f-9859-1df515f3b2d2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 16 18:39:27.912333 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:27.912274 2574 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="5e421425-027c-465c-a1f0-e16d1f6f7266" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused" Apr 16 18:39:31.658582 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:31.658532 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" podUID="49f3223a-0126-4965-9f3b-99fee62af21a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 16 18:39:31.671741 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:31.671700 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" podUID="b5b23a33-a9f2-4d3f-9859-1df515f3b2d2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 16 18:39:37.921481 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:37.921431 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:39:37.930410 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:37.930383 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:39:40.946612 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:40.946578 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4lsvd_1be4b879-19c7-4497-badb-3f90683cdd48/console-operator/1.log" Apr 16 18:39:40.948426 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:40.948397 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4lsvd_1be4b879-19c7-4497-badb-3f90683cdd48/console-operator/1.log" Apr 16 18:39:40.952814 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:40.952789 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wt28v_d40f8597-4c6c-46ba-9f26-4cea171429a6/ovn-acl-logging/0.log" Apr 16 18:39:40.954359 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:40.954336 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wt28v_d40f8597-4c6c-46ba-9f26-4cea171429a6/ovn-acl-logging/0.log" Apr 16 18:39:41.657860 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:41.657794 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" podUID="49f3223a-0126-4965-9f3b-99fee62af21a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 16 18:39:41.672531 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:41.672434 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" podUID="b5b23a33-a9f2-4d3f-9859-1df515f3b2d2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 16 18:39:51.658456 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:51.658402 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" podUID="49f3223a-0126-4965-9f3b-99fee62af21a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 16 18:39:51.672056 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:51.672014 2574 prober.go:120] "Probe failed" 
probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" podUID="b5b23a33-a9f2-4d3f-9859-1df515f3b2d2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 16 18:39:55.248681 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:55.248639 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 18:39:55.249145 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:55.248984 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="5e421425-027c-465c-a1f0-e16d1f6f7266" containerName="main" containerID="cri-o://ef1ac48886b3b794fcd36b3520f437df15b89f9004212973ab34d4f55b41460a" gracePeriod=30 Apr 16 18:39:56.599939 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:56.599913 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:39:56.679378 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:56.679259 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5e421425-027c-465c-a1f0-e16d1f6f7266-home\") pod \"5e421425-027c-465c-a1f0-e16d1f6f7266\" (UID: \"5e421425-027c-465c-a1f0-e16d1f6f7266\") " Apr 16 18:39:56.679378 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:56.679330 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5e421425-027c-465c-a1f0-e16d1f6f7266-model-cache\") pod \"5e421425-027c-465c-a1f0-e16d1f6f7266\" (UID: \"5e421425-027c-465c-a1f0-e16d1f6f7266\") " Apr 16 18:39:56.679378 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:56.679346 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5e421425-027c-465c-a1f0-e16d1f6f7266-dshm\") pod \"5e421425-027c-465c-a1f0-e16d1f6f7266\" (UID: \"5e421425-027c-465c-a1f0-e16d1f6f7266\") " Apr 16 18:39:56.679378 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:56.679381 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwd7c\" (UniqueName: \"kubernetes.io/projected/5e421425-027c-465c-a1f0-e16d1f6f7266-kube-api-access-vwd7c\") pod \"5e421425-027c-465c-a1f0-e16d1f6f7266\" (UID: \"5e421425-027c-465c-a1f0-e16d1f6f7266\") " Apr 16 18:39:56.679789 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:56.679415 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e421425-027c-465c-a1f0-e16d1f6f7266-kserve-provision-location\") pod \"5e421425-027c-465c-a1f0-e16d1f6f7266\" (UID: \"5e421425-027c-465c-a1f0-e16d1f6f7266\") " Apr 16 18:39:56.679789 ip-10-0-142-43 
kubenswrapper[2574]: I0416 18:39:56.679449 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5e421425-027c-465c-a1f0-e16d1f6f7266-tls-certs\") pod \"5e421425-027c-465c-a1f0-e16d1f6f7266\" (UID: \"5e421425-027c-465c-a1f0-e16d1f6f7266\") " Apr 16 18:39:56.679789 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:56.679631 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e421425-027c-465c-a1f0-e16d1f6f7266-model-cache" (OuterVolumeSpecName: "model-cache") pod "5e421425-027c-465c-a1f0-e16d1f6f7266" (UID: "5e421425-027c-465c-a1f0-e16d1f6f7266"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:39:56.679789 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:56.679704 2574 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5e421425-027c-465c-a1f0-e16d1f6f7266-model-cache\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:39:56.679789 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:56.679711 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e421425-027c-465c-a1f0-e16d1f6f7266-home" (OuterVolumeSpecName: "home") pod "5e421425-027c-465c-a1f0-e16d1f6f7266" (UID: "5e421425-027c-465c-a1f0-e16d1f6f7266"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:39:56.681747 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:56.681712 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e421425-027c-465c-a1f0-e16d1f6f7266-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "5e421425-027c-465c-a1f0-e16d1f6f7266" (UID: "5e421425-027c-465c-a1f0-e16d1f6f7266"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:39:56.681747 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:56.681730 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e421425-027c-465c-a1f0-e16d1f6f7266-kube-api-access-vwd7c" (OuterVolumeSpecName: "kube-api-access-vwd7c") pod "5e421425-027c-465c-a1f0-e16d1f6f7266" (UID: "5e421425-027c-465c-a1f0-e16d1f6f7266"). InnerVolumeSpecName "kube-api-access-vwd7c". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:39:56.681984 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:56.681945 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e421425-027c-465c-a1f0-e16d1f6f7266-dshm" (OuterVolumeSpecName: "dshm") pod "5e421425-027c-465c-a1f0-e16d1f6f7266" (UID: "5e421425-027c-465c-a1f0-e16d1f6f7266"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:39:56.743531 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:56.743470 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e421425-027c-465c-a1f0-e16d1f6f7266-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5e421425-027c-465c-a1f0-e16d1f6f7266" (UID: "5e421425-027c-465c-a1f0-e16d1f6f7266"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:39:56.780405 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:56.780360 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vwd7c\" (UniqueName: \"kubernetes.io/projected/5e421425-027c-465c-a1f0-e16d1f6f7266-kube-api-access-vwd7c\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:39:56.780569 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:56.780412 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e421425-027c-465c-a1f0-e16d1f6f7266-kserve-provision-location\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:39:56.780569 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:56.780430 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5e421425-027c-465c-a1f0-e16d1f6f7266-tls-certs\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:39:56.780569 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:56.780446 2574 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5e421425-027c-465c-a1f0-e16d1f6f7266-home\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:39:56.780569 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:56.780460 2574 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5e421425-027c-465c-a1f0-e16d1f6f7266-dshm\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:39:57.159962 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:57.159926 2574 generic.go:358] "Generic (PLEG): container finished" podID="5e421425-027c-465c-a1f0-e16d1f6f7266" containerID="ef1ac48886b3b794fcd36b3520f437df15b89f9004212973ab34d4f55b41460a" exitCode=0 Apr 16 18:39:57.160150 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:57.160010 2574 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:39:57.160150 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:57.160015 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"5e421425-027c-465c-a1f0-e16d1f6f7266","Type":"ContainerDied","Data":"ef1ac48886b3b794fcd36b3520f437df15b89f9004212973ab34d4f55b41460a"} Apr 16 18:39:57.160150 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:57.160063 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"5e421425-027c-465c-a1f0-e16d1f6f7266","Type":"ContainerDied","Data":"f3fc8c47efe0ad276c3c8b3a9883fb4280d6cc5854eedc5973b8c8a3da14f859"} Apr 16 18:39:57.160150 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:57.160081 2574 scope.go:117] "RemoveContainer" containerID="ef1ac48886b3b794fcd36b3520f437df15b89f9004212973ab34d4f55b41460a" Apr 16 18:39:57.185562 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:57.185516 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 18:39:57.188232 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:57.188198 2574 scope.go:117] "RemoveContainer" containerID="26982bada79faa1089a182cb4428f039e5531aad4507575efacf34075f018106" Apr 16 18:39:57.191125 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:57.191100 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 18:39:57.254711 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:57.254688 2574 scope.go:117] "RemoveContainer" containerID="ef1ac48886b3b794fcd36b3520f437df15b89f9004212973ab34d4f55b41460a" Apr 16 18:39:57.255127 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:39:57.255099 2574 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"ef1ac48886b3b794fcd36b3520f437df15b89f9004212973ab34d4f55b41460a\": container with ID starting with ef1ac48886b3b794fcd36b3520f437df15b89f9004212973ab34d4f55b41460a not found: ID does not exist" containerID="ef1ac48886b3b794fcd36b3520f437df15b89f9004212973ab34d4f55b41460a" Apr 16 18:39:57.255207 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:57.255138 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef1ac48886b3b794fcd36b3520f437df15b89f9004212973ab34d4f55b41460a"} err="failed to get container status \"ef1ac48886b3b794fcd36b3520f437df15b89f9004212973ab34d4f55b41460a\": rpc error: code = NotFound desc = could not find container \"ef1ac48886b3b794fcd36b3520f437df15b89f9004212973ab34d4f55b41460a\": container with ID starting with ef1ac48886b3b794fcd36b3520f437df15b89f9004212973ab34d4f55b41460a not found: ID does not exist" Apr 16 18:39:57.255207 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:57.255158 2574 scope.go:117] "RemoveContainer" containerID="26982bada79faa1089a182cb4428f039e5531aad4507575efacf34075f018106" Apr 16 18:39:57.255451 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:39:57.255431 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26982bada79faa1089a182cb4428f039e5531aad4507575efacf34075f018106\": container with ID starting with 26982bada79faa1089a182cb4428f039e5531aad4507575efacf34075f018106 not found: ID does not exist" containerID="26982bada79faa1089a182cb4428f039e5531aad4507575efacf34075f018106" Apr 16 18:39:57.255515 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:57.255458 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26982bada79faa1089a182cb4428f039e5531aad4507575efacf34075f018106"} err="failed to get container status \"26982bada79faa1089a182cb4428f039e5531aad4507575efacf34075f018106\": rpc error: 
code = NotFound desc = could not find container \"26982bada79faa1089a182cb4428f039e5531aad4507575efacf34075f018106\": container with ID starting with 26982bada79faa1089a182cb4428f039e5531aad4507575efacf34075f018106 not found: ID does not exist" Apr 16 18:39:58.943038 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:39:58.943001 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e421425-027c-465c-a1f0-e16d1f6f7266" path="/var/lib/kubelet/pods/5e421425-027c-465c-a1f0-e16d1f6f7266/volumes" Apr 16 18:40:01.657527 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:40:01.657483 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" podUID="49f3223a-0126-4965-9f3b-99fee62af21a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 16 18:40:01.672239 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:40:01.672191 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" podUID="b5b23a33-a9f2-4d3f-9859-1df515f3b2d2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 16 18:40:11.657938 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:40:11.657884 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" podUID="49f3223a-0126-4965-9f3b-99fee62af21a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 16 18:40:11.672494 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:40:11.672440 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" 
podUID="b5b23a33-a9f2-4d3f-9859-1df515f3b2d2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 16 18:40:21.658344 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:40:21.658293 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" podUID="49f3223a-0126-4965-9f3b-99fee62af21a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 16 18:40:21.671511 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:40:21.671455 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" podUID="b5b23a33-a9f2-4d3f-9859-1df515f3b2d2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 16 18:40:31.668205 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:40:31.668173 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" Apr 16 18:40:31.682753 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:40:31.682716 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" Apr 16 18:40:31.687711 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:40:31.687684 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" Apr 16 18:40:31.691190 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:40:31.691137 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" Apr 16 18:40:53.990648 
ip-10-0-142-43 kubenswrapper[2574]: I0416 18:40:53.990542 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg"] Apr 16 18:40:53.993546 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:40:53.990957 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" podUID="b5b23a33-a9f2-4d3f-9859-1df515f3b2d2" containerName="main" containerID="cri-o://16c1fa10d1293e8c298f3955afa7c4bd573c51c390895d7ff340d07324d38df1" gracePeriod=30 Apr 16 18:40:53.993546 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:40:53.992347 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx"] Apr 16 18:40:53.993546 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:40:53.992723 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" podUID="49f3223a-0126-4965-9f3b-99fee62af21a" containerName="main" containerID="cri-o://4d8746db5716e20a4f8820a5ed40c42da3ea91a7269c4f7ef4e560d7fc7ad424" gracePeriod=30 Apr 16 18:41:15.910118 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:15.910068 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln"] Apr 16 18:41:15.910753 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:15.910504 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" containerName="storage-initializer" Apr 16 18:41:15.910753 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:15.910524 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" containerName="storage-initializer" Apr 16 18:41:15.910753 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:15.910541 2574 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e421425-027c-465c-a1f0-e16d1f6f7266" containerName="main" Apr 16 18:41:15.910753 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:15.910550 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e421425-027c-465c-a1f0-e16d1f6f7266" containerName="main" Apr 16 18:41:15.910753 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:15.910561 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="158967f4-4093-40cd-9f7c-a987d16c8a1c" containerName="main" Apr 16 18:41:15.910753 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:15.910570 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="158967f4-4093-40cd-9f7c-a987d16c8a1c" containerName="main" Apr 16 18:41:15.910753 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:15.910585 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" containerName="main" Apr 16 18:41:15.910753 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:15.910590 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" containerName="main" Apr 16 18:41:15.910753 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:15.910599 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" containerName="llm-d-routing-sidecar" Apr 16 18:41:15.910753 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:15.910604 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" containerName="llm-d-routing-sidecar" Apr 16 18:41:15.910753 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:15.910612 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="158967f4-4093-40cd-9f7c-a987d16c8a1c" containerName="storage-initializer" Apr 16 18:41:15.910753 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:15.910617 2574 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="158967f4-4093-40cd-9f7c-a987d16c8a1c" containerName="storage-initializer" Apr 16 18:41:15.910753 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:15.910629 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e421425-027c-465c-a1f0-e16d1f6f7266" containerName="storage-initializer" Apr 16 18:41:15.910753 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:15.910634 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e421425-027c-465c-a1f0-e16d1f6f7266" containerName="storage-initializer" Apr 16 18:41:15.910753 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:15.910715 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" containerName="main" Apr 16 18:41:15.910753 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:15.910729 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="158967f4-4093-40cd-9f7c-a987d16c8a1c" containerName="main" Apr 16 18:41:15.910753 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:15.910741 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="5e421425-027c-465c-a1f0-e16d1f6f7266" containerName="main" Apr 16 18:41:15.910753 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:15.910751 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="1fa1bfcf-592e-44ab-ac60-e9ee7850a8a4" containerName="llm-d-routing-sidecar" Apr 16 18:41:15.914079 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:15.914054 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" Apr 16 18:41:15.916682 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:15.916653 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 16 18:41:15.926134 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:15.926101 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln"] Apr 16 18:41:16.075421 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:16.075360 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b0fca647-378a-413c-a503-ae433fdfc711-home\") pod \"router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln\" (UID: \"b0fca647-378a-413c-a503-ae433fdfc711\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" Apr 16 18:41:16.075421 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:16.075428 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b0fca647-378a-413c-a503-ae433fdfc711-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln\" (UID: \"b0fca647-378a-413c-a503-ae433fdfc711\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" Apr 16 18:41:16.075693 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:16.075510 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0fca647-378a-413c-a503-ae433fdfc711-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln\" (UID: \"b0fca647-378a-413c-a503-ae433fdfc711\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" Apr 16 18:41:16.075693 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:16.075584 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b0fca647-378a-413c-a503-ae433fdfc711-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln\" (UID: \"b0fca647-378a-413c-a503-ae433fdfc711\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" Apr 16 18:41:16.075693 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:16.075664 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qbnq\" (UniqueName: \"kubernetes.io/projected/b0fca647-378a-413c-a503-ae433fdfc711-kube-api-access-2qbnq\") pod \"router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln\" (UID: \"b0fca647-378a-413c-a503-ae433fdfc711\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" Apr 16 18:41:16.075814 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:16.075740 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b0fca647-378a-413c-a503-ae433fdfc711-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln\" (UID: \"b0fca647-378a-413c-a503-ae433fdfc711\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" Apr 16 18:41:16.176849 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:16.176730 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0fca647-378a-413c-a503-ae433fdfc711-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln\" (UID: \"b0fca647-378a-413c-a503-ae433fdfc711\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" Apr 16 18:41:16.176849 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:16.176784 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b0fca647-378a-413c-a503-ae433fdfc711-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln\" (UID: \"b0fca647-378a-413c-a503-ae433fdfc711\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" Apr 16 18:41:16.176849 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:16.176820 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2qbnq\" (UniqueName: \"kubernetes.io/projected/b0fca647-378a-413c-a503-ae433fdfc711-kube-api-access-2qbnq\") pod \"router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln\" (UID: \"b0fca647-378a-413c-a503-ae433fdfc711\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" Apr 16 18:41:16.177141 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:16.176897 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b0fca647-378a-413c-a503-ae433fdfc711-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln\" (UID: \"b0fca647-378a-413c-a503-ae433fdfc711\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" Apr 16 18:41:16.177141 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:16.176944 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b0fca647-378a-413c-a503-ae433fdfc711-home\") pod \"router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln\" (UID: \"b0fca647-378a-413c-a503-ae433fdfc711\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" Apr 16 18:41:16.177141 ip-10-0-142-43 kubenswrapper[2574]: I0416 
18:41:16.176973 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b0fca647-378a-413c-a503-ae433fdfc711-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln\" (UID: \"b0fca647-378a-413c-a503-ae433fdfc711\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" Apr 16 18:41:16.177291 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:16.177227 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0fca647-378a-413c-a503-ae433fdfc711-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln\" (UID: \"b0fca647-378a-413c-a503-ae433fdfc711\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" Apr 16 18:41:16.177291 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:16.177268 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b0fca647-378a-413c-a503-ae433fdfc711-home\") pod \"router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln\" (UID: \"b0fca647-378a-413c-a503-ae433fdfc711\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" Apr 16 18:41:16.177388 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:16.177311 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b0fca647-378a-413c-a503-ae433fdfc711-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln\" (UID: \"b0fca647-378a-413c-a503-ae433fdfc711\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" Apr 16 18:41:16.179128 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:16.179109 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/b0fca647-378a-413c-a503-ae433fdfc711-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln\" (UID: \"b0fca647-378a-413c-a503-ae433fdfc711\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" Apr 16 18:41:16.179511 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:16.179489 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b0fca647-378a-413c-a503-ae433fdfc711-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln\" (UID: \"b0fca647-378a-413c-a503-ae433fdfc711\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" Apr 16 18:41:16.186670 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:16.186624 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qbnq\" (UniqueName: \"kubernetes.io/projected/b0fca647-378a-413c-a503-ae433fdfc711-kube-api-access-2qbnq\") pod \"router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln\" (UID: \"b0fca647-378a-413c-a503-ae433fdfc711\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" Apr 16 18:41:16.227646 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:16.227600 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" Apr 16 18:41:16.363431 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:16.363377 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln"] Apr 16 18:41:16.368274 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:41:16.368241 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0fca647_378a_413c_a503_ae433fdfc711.slice/crio-fe6e190f5d6c190e5f25ef4142b4b7436dd6534c3a345bdebc4db6ec11e2de96 WatchSource:0}: Error finding container fe6e190f5d6c190e5f25ef4142b4b7436dd6534c3a345bdebc4db6ec11e2de96: Status 404 returned error can't find the container with id fe6e190f5d6c190e5f25ef4142b4b7436dd6534c3a345bdebc4db6ec11e2de96 Apr 16 18:41:16.436162 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:16.436074 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" event={"ID":"b0fca647-378a-413c-a503-ae433fdfc711","Type":"ContainerStarted","Data":"1e2b0414ac75f72c7de4aa39c8a118722ffefba127e4d8ddcf070a473da21605"} Apr 16 18:41:16.436162 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:16.436113 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" event={"ID":"b0fca647-378a-413c-a503-ae433fdfc711","Type":"ContainerStarted","Data":"fe6e190f5d6c190e5f25ef4142b4b7436dd6534c3a345bdebc4db6ec11e2de96"} Apr 16 18:41:21.455214 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:21.455135 2574 generic.go:358] "Generic (PLEG): container finished" podID="b0fca647-378a-413c-a503-ae433fdfc711" containerID="1e2b0414ac75f72c7de4aa39c8a118722ffefba127e4d8ddcf070a473da21605" exitCode=0 Apr 16 18:41:21.455666 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:21.455174 2574 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" event={"ID":"b0fca647-378a-413c-a503-ae433fdfc711","Type":"ContainerDied","Data":"1e2b0414ac75f72c7de4aa39c8a118722ffefba127e4d8ddcf070a473da21605"} Apr 16 18:41:22.460124 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:22.460090 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" event={"ID":"b0fca647-378a-413c-a503-ae433fdfc711","Type":"ContainerStarted","Data":"e889345807df294b2fb4cc165dae8dd4cbbafa79b659adda2660b60f557c39e6"} Apr 16 18:41:22.484751 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:22.484700 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" podStartSLOduration=7.484680265 podStartE2EDuration="7.484680265s" podCreationTimestamp="2026-04-16 18:41:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:41:22.482225433 +0000 UTC m=+1902.137460307" watchObservedRunningTime="2026-04-16 18:41:22.484680265 +0000 UTC m=+1902.139915117" Apr 16 18:41:23.993324 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:23.993243 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" podUID="49f3223a-0126-4965-9f3b-99fee62af21a" containerName="llm-d-routing-sidecar" containerID="cri-o://e11ec0c88c7c886af7683b906bd61c5231b5389856face567b721258ce1699f9" gracePeriod=2 Apr 16 18:41:24.280118 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.280090 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" Apr 16 18:41:24.400671 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.400646 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx_49f3223a-0126-4965-9f3b-99fee62af21a/main/0.log" Apr 16 18:41:24.401476 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.401455 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" Apr 16 18:41:24.453197 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.453155 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2-tls-certs\") pod \"b5b23a33-a9f2-4d3f-9859-1df515f3b2d2\" (UID: \"b5b23a33-a9f2-4d3f-9859-1df515f3b2d2\") " Apr 16 18:41:24.453197 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.453204 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2-home\") pod \"b5b23a33-a9f2-4d3f-9859-1df515f3b2d2\" (UID: \"b5b23a33-a9f2-4d3f-9859-1df515f3b2d2\") " Apr 16 18:41:24.453457 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.453247 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mr99x\" (UniqueName: \"kubernetes.io/projected/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2-kube-api-access-mr99x\") pod \"b5b23a33-a9f2-4d3f-9859-1df515f3b2d2\" (UID: \"b5b23a33-a9f2-4d3f-9859-1df515f3b2d2\") " Apr 16 18:41:24.453457 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.453319 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2-dshm\") pod 
\"b5b23a33-a9f2-4d3f-9859-1df515f3b2d2\" (UID: \"b5b23a33-a9f2-4d3f-9859-1df515f3b2d2\") " Apr 16 18:41:24.453457 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.453353 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2-model-cache\") pod \"b5b23a33-a9f2-4d3f-9859-1df515f3b2d2\" (UID: \"b5b23a33-a9f2-4d3f-9859-1df515f3b2d2\") " Apr 16 18:41:24.453457 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.453410 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2-kserve-provision-location\") pod \"b5b23a33-a9f2-4d3f-9859-1df515f3b2d2\" (UID: \"b5b23a33-a9f2-4d3f-9859-1df515f3b2d2\") " Apr 16 18:41:24.453693 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.453657 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2-home" (OuterVolumeSpecName: "home") pod "b5b23a33-a9f2-4d3f-9859-1df515f3b2d2" (UID: "b5b23a33-a9f2-4d3f-9859-1df515f3b2d2"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:41:24.453915 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.453878 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2-model-cache" (OuterVolumeSpecName: "model-cache") pod "b5b23a33-a9f2-4d3f-9859-1df515f3b2d2" (UID: "b5b23a33-a9f2-4d3f-9859-1df515f3b2d2"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:41:24.455739 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.455713 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b5b23a33-a9f2-4d3f-9859-1df515f3b2d2" (UID: "b5b23a33-a9f2-4d3f-9859-1df515f3b2d2"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:41:24.455986 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.455966 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2-kube-api-access-mr99x" (OuterVolumeSpecName: "kube-api-access-mr99x") pod "b5b23a33-a9f2-4d3f-9859-1df515f3b2d2" (UID: "b5b23a33-a9f2-4d3f-9859-1df515f3b2d2"). InnerVolumeSpecName "kube-api-access-mr99x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:41:24.456086 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.456065 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2-dshm" (OuterVolumeSpecName: "dshm") pod "b5b23a33-a9f2-4d3f-9859-1df515f3b2d2" (UID: "b5b23a33-a9f2-4d3f-9859-1df515f3b2d2"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:41:24.468748 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.468716 2574 generic.go:358] "Generic (PLEG): container finished" podID="b5b23a33-a9f2-4d3f-9859-1df515f3b2d2" containerID="16c1fa10d1293e8c298f3955afa7c4bd573c51c390895d7ff340d07324d38df1" exitCode=137 Apr 16 18:41:24.468951 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.468790 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" Apr 16 18:41:24.468951 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.468807 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" event={"ID":"b5b23a33-a9f2-4d3f-9859-1df515f3b2d2","Type":"ContainerDied","Data":"16c1fa10d1293e8c298f3955afa7c4bd573c51c390895d7ff340d07324d38df1"} Apr 16 18:41:24.468951 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.468877 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg" event={"ID":"b5b23a33-a9f2-4d3f-9859-1df515f3b2d2","Type":"ContainerDied","Data":"e2cc0d18ac04bf6cfd7450dead1f49de43c25330467c5b901ae4bd02c21f8b70"} Apr 16 18:41:24.468951 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.468901 2574 scope.go:117] "RemoveContainer" containerID="16c1fa10d1293e8c298f3955afa7c4bd573c51c390895d7ff340d07324d38df1" Apr 16 18:41:24.470492 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.470470 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx_49f3223a-0126-4965-9f3b-99fee62af21a/main/0.log" Apr 16 18:41:24.471240 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.471207 2574 generic.go:358] "Generic (PLEG): container finished" podID="49f3223a-0126-4965-9f3b-99fee62af21a" containerID="4d8746db5716e20a4f8820a5ed40c42da3ea91a7269c4f7ef4e560d7fc7ad424" exitCode=137 Apr 16 18:41:24.471240 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.471229 2574 generic.go:358] "Generic (PLEG): container finished" podID="49f3223a-0126-4965-9f3b-99fee62af21a" containerID="e11ec0c88c7c886af7683b906bd61c5231b5389856face567b721258ce1699f9" exitCode=0 Apr 16 18:41:24.471402 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.471262 2574 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" event={"ID":"49f3223a-0126-4965-9f3b-99fee62af21a","Type":"ContainerDied","Data":"4d8746db5716e20a4f8820a5ed40c42da3ea91a7269c4f7ef4e560d7fc7ad424"} Apr 16 18:41:24.471402 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.471294 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" event={"ID":"49f3223a-0126-4965-9f3b-99fee62af21a","Type":"ContainerDied","Data":"e11ec0c88c7c886af7683b906bd61c5231b5389856face567b721258ce1699f9"} Apr 16 18:41:24.471402 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.471297 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" Apr 16 18:41:24.471402 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.471310 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx" event={"ID":"49f3223a-0126-4965-9f3b-99fee62af21a","Type":"ContainerDied","Data":"8b1f228860e5a006a77f42117b4dfdf416aaa6519d353a64a9dbfb506315a905"} Apr 16 18:41:24.499923 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.499898 2574 scope.go:117] "RemoveContainer" containerID="013aa86c715dff86b6d800252fba0362e31daf2d4e9af8782a876c896b57cc62" Apr 16 18:41:24.516333 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.516291 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b5b23a33-a9f2-4d3f-9859-1df515f3b2d2" (UID: "b5b23a33-a9f2-4d3f-9859-1df515f3b2d2"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:41:24.554433 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.554378 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzxld\" (UniqueName: \"kubernetes.io/projected/49f3223a-0126-4965-9f3b-99fee62af21a-kube-api-access-xzxld\") pod \"49f3223a-0126-4965-9f3b-99fee62af21a\" (UID: \"49f3223a-0126-4965-9f3b-99fee62af21a\") " Apr 16 18:41:24.554617 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.554466 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/49f3223a-0126-4965-9f3b-99fee62af21a-dshm\") pod \"49f3223a-0126-4965-9f3b-99fee62af21a\" (UID: \"49f3223a-0126-4965-9f3b-99fee62af21a\") " Apr 16 18:41:24.554617 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.554560 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/49f3223a-0126-4965-9f3b-99fee62af21a-kserve-provision-location\") pod \"49f3223a-0126-4965-9f3b-99fee62af21a\" (UID: \"49f3223a-0126-4965-9f3b-99fee62af21a\") " Apr 16 18:41:24.554741 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.554658 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/49f3223a-0126-4965-9f3b-99fee62af21a-model-cache\") pod \"49f3223a-0126-4965-9f3b-99fee62af21a\" (UID: \"49f3223a-0126-4965-9f3b-99fee62af21a\") " Apr 16 18:41:24.554741 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.554702 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/49f3223a-0126-4965-9f3b-99fee62af21a-home\") pod \"49f3223a-0126-4965-9f3b-99fee62af21a\" (UID: \"49f3223a-0126-4965-9f3b-99fee62af21a\") " Apr 16 18:41:24.554741 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.554723 2574 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/49f3223a-0126-4965-9f3b-99fee62af21a-tls-certs\") pod \"49f3223a-0126-4965-9f3b-99fee62af21a\" (UID: \"49f3223a-0126-4965-9f3b-99fee62af21a\") " Apr 16 18:41:24.555345 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.555013 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mr99x\" (UniqueName: \"kubernetes.io/projected/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2-kube-api-access-mr99x\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:41:24.555345 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.555039 2574 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2-dshm\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:41:24.555345 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.555055 2574 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2-model-cache\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:41:24.555345 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.555072 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2-kserve-provision-location\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:41:24.555345 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.555088 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2-tls-certs\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:41:24.555345 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.555103 2574 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2-home\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:41:24.555345 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.555219 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49f3223a-0126-4965-9f3b-99fee62af21a-home" (OuterVolumeSpecName: "home") pod "49f3223a-0126-4965-9f3b-99fee62af21a" (UID: "49f3223a-0126-4965-9f3b-99fee62af21a"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:41:24.555345 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.555236 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49f3223a-0126-4965-9f3b-99fee62af21a-model-cache" (OuterVolumeSpecName: "model-cache") pod "49f3223a-0126-4965-9f3b-99fee62af21a" (UID: "49f3223a-0126-4965-9f3b-99fee62af21a"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:41:24.557330 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.557212 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49f3223a-0126-4965-9f3b-99fee62af21a-kube-api-access-xzxld" (OuterVolumeSpecName: "kube-api-access-xzxld") pod "49f3223a-0126-4965-9f3b-99fee62af21a" (UID: "49f3223a-0126-4965-9f3b-99fee62af21a"). InnerVolumeSpecName "kube-api-access-xzxld". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:41:24.557330 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.557249 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49f3223a-0126-4965-9f3b-99fee62af21a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "49f3223a-0126-4965-9f3b-99fee62af21a" (UID: "49f3223a-0126-4965-9f3b-99fee62af21a"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:41:24.557330 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.557296 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49f3223a-0126-4965-9f3b-99fee62af21a-dshm" (OuterVolumeSpecName: "dshm") pod "49f3223a-0126-4965-9f3b-99fee62af21a" (UID: "49f3223a-0126-4965-9f3b-99fee62af21a"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:41:24.561704 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.561677 2574 scope.go:117] "RemoveContainer" containerID="16c1fa10d1293e8c298f3955afa7c4bd573c51c390895d7ff340d07324d38df1" Apr 16 18:41:24.562105 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:41:24.562082 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16c1fa10d1293e8c298f3955afa7c4bd573c51c390895d7ff340d07324d38df1\": container with ID starting with 16c1fa10d1293e8c298f3955afa7c4bd573c51c390895d7ff340d07324d38df1 not found: ID does not exist" containerID="16c1fa10d1293e8c298f3955afa7c4bd573c51c390895d7ff340d07324d38df1" Apr 16 18:41:24.562213 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.562113 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16c1fa10d1293e8c298f3955afa7c4bd573c51c390895d7ff340d07324d38df1"} err="failed to get container status \"16c1fa10d1293e8c298f3955afa7c4bd573c51c390895d7ff340d07324d38df1\": rpc error: code = NotFound desc = could not find container \"16c1fa10d1293e8c298f3955afa7c4bd573c51c390895d7ff340d07324d38df1\": container with ID starting with 16c1fa10d1293e8c298f3955afa7c4bd573c51c390895d7ff340d07324d38df1 not found: ID does not exist" Apr 16 18:41:24.562213 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.562134 2574 scope.go:117] "RemoveContainer" containerID="013aa86c715dff86b6d800252fba0362e31daf2d4e9af8782a876c896b57cc62" Apr 16 18:41:24.562460 
ip-10-0-142-43 kubenswrapper[2574]: E0416 18:41:24.562426 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"013aa86c715dff86b6d800252fba0362e31daf2d4e9af8782a876c896b57cc62\": container with ID starting with 013aa86c715dff86b6d800252fba0362e31daf2d4e9af8782a876c896b57cc62 not found: ID does not exist" containerID="013aa86c715dff86b6d800252fba0362e31daf2d4e9af8782a876c896b57cc62" Apr 16 18:41:24.562577 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.562461 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"013aa86c715dff86b6d800252fba0362e31daf2d4e9af8782a876c896b57cc62"} err="failed to get container status \"013aa86c715dff86b6d800252fba0362e31daf2d4e9af8782a876c896b57cc62\": rpc error: code = NotFound desc = could not find container \"013aa86c715dff86b6d800252fba0362e31daf2d4e9af8782a876c896b57cc62\": container with ID starting with 013aa86c715dff86b6d800252fba0362e31daf2d4e9af8782a876c896b57cc62 not found: ID does not exist" Apr 16 18:41:24.562577 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.562484 2574 scope.go:117] "RemoveContainer" containerID="4d8746db5716e20a4f8820a5ed40c42da3ea91a7269c4f7ef4e560d7fc7ad424" Apr 16 18:41:24.595211 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.595186 2574 scope.go:117] "RemoveContainer" containerID="76957d1fe13a0c6f5b1a9268abdb9c1acacfc9eb7362d1725ec0e9b2316ff776" Apr 16 18:41:24.621022 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.620977 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49f3223a-0126-4965-9f3b-99fee62af21a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "49f3223a-0126-4965-9f3b-99fee62af21a" (UID: "49f3223a-0126-4965-9f3b-99fee62af21a"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:41:24.655943 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.655909 2574 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/49f3223a-0126-4965-9f3b-99fee62af21a-model-cache\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:41:24.655943 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.655941 2574 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/49f3223a-0126-4965-9f3b-99fee62af21a-home\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:41:24.656129 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.655952 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/49f3223a-0126-4965-9f3b-99fee62af21a-tls-certs\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:41:24.656129 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.655967 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xzxld\" (UniqueName: \"kubernetes.io/projected/49f3223a-0126-4965-9f3b-99fee62af21a-kube-api-access-xzxld\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:41:24.656129 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.655977 2574 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/49f3223a-0126-4965-9f3b-99fee62af21a-dshm\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:41:24.656129 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.655987 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/49f3223a-0126-4965-9f3b-99fee62af21a-kserve-provision-location\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:41:24.661965 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.661946 2574 scope.go:117] 
"RemoveContainer" containerID="e11ec0c88c7c886af7683b906bd61c5231b5389856face567b721258ce1699f9" Apr 16 18:41:24.671631 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.671607 2574 scope.go:117] "RemoveContainer" containerID="4d8746db5716e20a4f8820a5ed40c42da3ea91a7269c4f7ef4e560d7fc7ad424" Apr 16 18:41:24.671986 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:41:24.671964 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d8746db5716e20a4f8820a5ed40c42da3ea91a7269c4f7ef4e560d7fc7ad424\": container with ID starting with 4d8746db5716e20a4f8820a5ed40c42da3ea91a7269c4f7ef4e560d7fc7ad424 not found: ID does not exist" containerID="4d8746db5716e20a4f8820a5ed40c42da3ea91a7269c4f7ef4e560d7fc7ad424" Apr 16 18:41:24.672077 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.671996 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d8746db5716e20a4f8820a5ed40c42da3ea91a7269c4f7ef4e560d7fc7ad424"} err="failed to get container status \"4d8746db5716e20a4f8820a5ed40c42da3ea91a7269c4f7ef4e560d7fc7ad424\": rpc error: code = NotFound desc = could not find container \"4d8746db5716e20a4f8820a5ed40c42da3ea91a7269c4f7ef4e560d7fc7ad424\": container with ID starting with 4d8746db5716e20a4f8820a5ed40c42da3ea91a7269c4f7ef4e560d7fc7ad424 not found: ID does not exist" Apr 16 18:41:24.672077 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.672022 2574 scope.go:117] "RemoveContainer" containerID="76957d1fe13a0c6f5b1a9268abdb9c1acacfc9eb7362d1725ec0e9b2316ff776" Apr 16 18:41:24.672282 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:41:24.672265 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76957d1fe13a0c6f5b1a9268abdb9c1acacfc9eb7362d1725ec0e9b2316ff776\": container with ID starting with 76957d1fe13a0c6f5b1a9268abdb9c1acacfc9eb7362d1725ec0e9b2316ff776 not found: ID does not exist" 
containerID="76957d1fe13a0c6f5b1a9268abdb9c1acacfc9eb7362d1725ec0e9b2316ff776" Apr 16 18:41:24.672320 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.672289 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76957d1fe13a0c6f5b1a9268abdb9c1acacfc9eb7362d1725ec0e9b2316ff776"} err="failed to get container status \"76957d1fe13a0c6f5b1a9268abdb9c1acacfc9eb7362d1725ec0e9b2316ff776\": rpc error: code = NotFound desc = could not find container \"76957d1fe13a0c6f5b1a9268abdb9c1acacfc9eb7362d1725ec0e9b2316ff776\": container with ID starting with 76957d1fe13a0c6f5b1a9268abdb9c1acacfc9eb7362d1725ec0e9b2316ff776 not found: ID does not exist" Apr 16 18:41:24.672320 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.672312 2574 scope.go:117] "RemoveContainer" containerID="e11ec0c88c7c886af7683b906bd61c5231b5389856face567b721258ce1699f9" Apr 16 18:41:24.672598 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:41:24.672576 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e11ec0c88c7c886af7683b906bd61c5231b5389856face567b721258ce1699f9\": container with ID starting with e11ec0c88c7c886af7683b906bd61c5231b5389856face567b721258ce1699f9 not found: ID does not exist" containerID="e11ec0c88c7c886af7683b906bd61c5231b5389856face567b721258ce1699f9" Apr 16 18:41:24.672647 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.672604 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e11ec0c88c7c886af7683b906bd61c5231b5389856face567b721258ce1699f9"} err="failed to get container status \"e11ec0c88c7c886af7683b906bd61c5231b5389856face567b721258ce1699f9\": rpc error: code = NotFound desc = could not find container \"e11ec0c88c7c886af7683b906bd61c5231b5389856face567b721258ce1699f9\": container with ID starting with e11ec0c88c7c886af7683b906bd61c5231b5389856face567b721258ce1699f9 not found: ID does not exist" Apr 16 
18:41:24.672647 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.672620 2574 scope.go:117] "RemoveContainer" containerID="4d8746db5716e20a4f8820a5ed40c42da3ea91a7269c4f7ef4e560d7fc7ad424" Apr 16 18:41:24.672884 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.672864 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d8746db5716e20a4f8820a5ed40c42da3ea91a7269c4f7ef4e560d7fc7ad424"} err="failed to get container status \"4d8746db5716e20a4f8820a5ed40c42da3ea91a7269c4f7ef4e560d7fc7ad424\": rpc error: code = NotFound desc = could not find container \"4d8746db5716e20a4f8820a5ed40c42da3ea91a7269c4f7ef4e560d7fc7ad424\": container with ID starting with 4d8746db5716e20a4f8820a5ed40c42da3ea91a7269c4f7ef4e560d7fc7ad424 not found: ID does not exist" Apr 16 18:41:24.672884 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.672885 2574 scope.go:117] "RemoveContainer" containerID="76957d1fe13a0c6f5b1a9268abdb9c1acacfc9eb7362d1725ec0e9b2316ff776" Apr 16 18:41:24.673128 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.673110 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76957d1fe13a0c6f5b1a9268abdb9c1acacfc9eb7362d1725ec0e9b2316ff776"} err="failed to get container status \"76957d1fe13a0c6f5b1a9268abdb9c1acacfc9eb7362d1725ec0e9b2316ff776\": rpc error: code = NotFound desc = could not find container \"76957d1fe13a0c6f5b1a9268abdb9c1acacfc9eb7362d1725ec0e9b2316ff776\": container with ID starting with 76957d1fe13a0c6f5b1a9268abdb9c1acacfc9eb7362d1725ec0e9b2316ff776 not found: ID does not exist" Apr 16 18:41:24.673191 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.673129 2574 scope.go:117] "RemoveContainer" containerID="e11ec0c88c7c886af7683b906bd61c5231b5389856face567b721258ce1699f9" Apr 16 18:41:24.673349 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.673332 2574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e11ec0c88c7c886af7683b906bd61c5231b5389856face567b721258ce1699f9"} err="failed to get container status \"e11ec0c88c7c886af7683b906bd61c5231b5389856face567b721258ce1699f9\": rpc error: code = NotFound desc = could not find container \"e11ec0c88c7c886af7683b906bd61c5231b5389856face567b721258ce1699f9\": container with ID starting with e11ec0c88c7c886af7683b906bd61c5231b5389856face567b721258ce1699f9 not found: ID does not exist" Apr 16 18:41:24.794029 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.793985 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg"] Apr 16 18:41:24.798898 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.798864 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-854f95b6ff-n8nvg"] Apr 16 18:41:24.811398 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.811367 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx"] Apr 16 18:41:24.815748 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.815713 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b746dcc4b-8xcjx"] Apr 16 18:41:24.938407 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.938360 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49f3223a-0126-4965-9f3b-99fee62af21a" path="/var/lib/kubelet/pods/49f3223a-0126-4965-9f3b-99fee62af21a/volumes" Apr 16 18:41:24.938869 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:24.938852 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5b23a33-a9f2-4d3f-9859-1df515f3b2d2" path="/var/lib/kubelet/pods/b5b23a33-a9f2-4d3f-9859-1df515f3b2d2/volumes" Apr 16 18:41:26.227847 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:26.227798 2574 kubelet.go:2658] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" Apr 16 18:41:26.228230 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:26.227864 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" Apr 16 18:41:26.229389 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:26.229360 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" podUID="b0fca647-378a-413c-a503-ae433fdfc711" containerName="main" probeResult="failure" output="Get \"https://10.133.0.42:8000/health\": dial tcp 10.133.0.42:8000: connect: connection refused" Apr 16 18:41:36.228649 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:36.228590 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" podUID="b0fca647-378a-413c-a503-ae433fdfc711" containerName="main" probeResult="failure" output="Get \"https://10.133.0.42:8000/health\": dial tcp 10.133.0.42:8000: connect: connection refused" Apr 16 18:41:46.228792 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:46.228745 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" podUID="b0fca647-378a-413c-a503-ae433fdfc711" containerName="main" probeResult="failure" output="Get \"https://10.133.0.42:8000/health\": dial tcp 10.133.0.42:8000: connect: connection refused" Apr 16 18:41:56.228431 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:41:56.228387 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" podUID="b0fca647-378a-413c-a503-ae433fdfc711" containerName="main" probeResult="failure" output="Get \"https://10.133.0.42:8000/health\": dial tcp 10.133.0.42:8000: connect: connection 
refused" Apr 16 18:42:06.228321 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:42:06.228265 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" podUID="b0fca647-378a-413c-a503-ae433fdfc711" containerName="main" probeResult="failure" output="Get \"https://10.133.0.42:8000/health\": dial tcp 10.133.0.42:8000: connect: connection refused" Apr 16 18:42:16.228264 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:42:16.228215 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" podUID="b0fca647-378a-413c-a503-ae433fdfc711" containerName="main" probeResult="failure" output="Get \"https://10.133.0.42:8000/health\": dial tcp 10.133.0.42:8000: connect: connection refused" Apr 16 18:42:26.228568 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:42:26.228477 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" podUID="b0fca647-378a-413c-a503-ae433fdfc711" containerName="main" probeResult="failure" output="Get \"https://10.133.0.42:8000/health\": dial tcp 10.133.0.42:8000: connect: connection refused" Apr 16 18:42:36.228213 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:42:36.228164 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" podUID="b0fca647-378a-413c-a503-ae433fdfc711" containerName="main" probeResult="failure" output="Get \"https://10.133.0.42:8000/health\": dial tcp 10.133.0.42:8000: connect: connection refused" Apr 16 18:42:46.228226 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:42:46.228175 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" podUID="b0fca647-378a-413c-a503-ae433fdfc711" containerName="main" probeResult="failure" output="Get 
\"https://10.133.0.42:8000/health\": dial tcp 10.133.0.42:8000: connect: connection refused" Apr 16 18:42:56.228709 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:42:56.228660 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" podUID="b0fca647-378a-413c-a503-ae433fdfc711" containerName="main" probeResult="failure" output="Get \"https://10.133.0.42:8000/health\": dial tcp 10.133.0.42:8000: connect: connection refused" Apr 16 18:43:06.238713 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:06.238675 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" Apr 16 18:43:06.246767 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:06.246732 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" Apr 16 18:43:17.831087 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:17.831045 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln"] Apr 16 18:43:17.831618 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:17.831465 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" podUID="b0fca647-378a-413c-a503-ae433fdfc711" containerName="main" containerID="cri-o://e889345807df294b2fb4cc165dae8dd4cbbafa79b659adda2660b60f557c39e6" gracePeriod=30 Apr 16 18:43:19.086731 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:19.086688 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jv4tf/must-gather-v8mxx"] Apr 16 18:43:19.087247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:19.087120 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49f3223a-0126-4965-9f3b-99fee62af21a" 
containerName="storage-initializer" Apr 16 18:43:19.087247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:19.087138 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="49f3223a-0126-4965-9f3b-99fee62af21a" containerName="storage-initializer" Apr 16 18:43:19.087247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:19.087157 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49f3223a-0126-4965-9f3b-99fee62af21a" containerName="llm-d-routing-sidecar" Apr 16 18:43:19.087247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:19.087165 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="49f3223a-0126-4965-9f3b-99fee62af21a" containerName="llm-d-routing-sidecar" Apr 16 18:43:19.087247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:19.087174 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49f3223a-0126-4965-9f3b-99fee62af21a" containerName="main" Apr 16 18:43:19.087247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:19.087182 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="49f3223a-0126-4965-9f3b-99fee62af21a" containerName="main" Apr 16 18:43:19.087247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:19.087192 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5b23a33-a9f2-4d3f-9859-1df515f3b2d2" containerName="main" Apr 16 18:43:19.087247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:19.087199 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b23a33-a9f2-4d3f-9859-1df515f3b2d2" containerName="main" Apr 16 18:43:19.087247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:19.087220 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5b23a33-a9f2-4d3f-9859-1df515f3b2d2" containerName="storage-initializer" Apr 16 18:43:19.087247 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:19.087229 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b23a33-a9f2-4d3f-9859-1df515f3b2d2" 
containerName="storage-initializer" Apr 16 18:43:19.087742 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:19.087308 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5b23a33-a9f2-4d3f-9859-1df515f3b2d2" containerName="main" Apr 16 18:43:19.087742 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:19.087322 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="49f3223a-0126-4965-9f3b-99fee62af21a" containerName="llm-d-routing-sidecar" Apr 16 18:43:19.087742 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:19.087334 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="49f3223a-0126-4965-9f3b-99fee62af21a" containerName="main" Apr 16 18:43:19.090704 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:19.090680 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jv4tf/must-gather-v8mxx" Apr 16 18:43:19.093339 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:19.093314 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jv4tf\"/\"kube-root-ca.crt\"" Apr 16 18:43:19.094503 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:19.094486 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jv4tf\"/\"openshift-service-ca.crt\"" Apr 16 18:43:19.094503 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:19.094494 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-jv4tf\"/\"default-dockercfg-rkhsn\"" Apr 16 18:43:19.097350 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:19.097295 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jv4tf/must-gather-v8mxx"] Apr 16 18:43:19.143781 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:19.143741 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/219a0ffb-9ff6-4f91-9495-bfe685811129-must-gather-output\") pod \"must-gather-v8mxx\" (UID: \"219a0ffb-9ff6-4f91-9495-bfe685811129\") " pod="openshift-must-gather-jv4tf/must-gather-v8mxx" Apr 16 18:43:19.143993 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:19.143801 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntdph\" (UniqueName: \"kubernetes.io/projected/219a0ffb-9ff6-4f91-9495-bfe685811129-kube-api-access-ntdph\") pod \"must-gather-v8mxx\" (UID: \"219a0ffb-9ff6-4f91-9495-bfe685811129\") " pod="openshift-must-gather-jv4tf/must-gather-v8mxx" Apr 16 18:43:19.244375 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:19.244335 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/219a0ffb-9ff6-4f91-9495-bfe685811129-must-gather-output\") pod \"must-gather-v8mxx\" (UID: \"219a0ffb-9ff6-4f91-9495-bfe685811129\") " pod="openshift-must-gather-jv4tf/must-gather-v8mxx" Apr 16 18:43:19.244551 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:19.244386 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ntdph\" (UniqueName: \"kubernetes.io/projected/219a0ffb-9ff6-4f91-9495-bfe685811129-kube-api-access-ntdph\") pod \"must-gather-v8mxx\" (UID: \"219a0ffb-9ff6-4f91-9495-bfe685811129\") " pod="openshift-must-gather-jv4tf/must-gather-v8mxx" Apr 16 18:43:19.244672 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:19.244652 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/219a0ffb-9ff6-4f91-9495-bfe685811129-must-gather-output\") pod \"must-gather-v8mxx\" (UID: \"219a0ffb-9ff6-4f91-9495-bfe685811129\") " pod="openshift-must-gather-jv4tf/must-gather-v8mxx" Apr 16 18:43:19.253364 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:19.253332 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ntdph\" (UniqueName: \"kubernetes.io/projected/219a0ffb-9ff6-4f91-9495-bfe685811129-kube-api-access-ntdph\") pod \"must-gather-v8mxx\" (UID: \"219a0ffb-9ff6-4f91-9495-bfe685811129\") " pod="openshift-must-gather-jv4tf/must-gather-v8mxx" Apr 16 18:43:19.401637 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:19.401600 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jv4tf/must-gather-v8mxx" Apr 16 18:43:19.528323 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:19.528202 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jv4tf/must-gather-v8mxx"] Apr 16 18:43:19.531184 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:43:19.531156 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod219a0ffb_9ff6_4f91_9495_bfe685811129.slice/crio-28e65b0064b8ab7fb5076536a64123cd16c86ceac493f2b62469cc96078edd73 WatchSource:0}: Error finding container 28e65b0064b8ab7fb5076536a64123cd16c86ceac493f2b62469cc96078edd73: Status 404 returned error can't find the container with id 28e65b0064b8ab7fb5076536a64123cd16c86ceac493f2b62469cc96078edd73 Apr 16 18:43:19.533271 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:19.533243 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:43:19.862016 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:19.861926 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jv4tf/must-gather-v8mxx" event={"ID":"219a0ffb-9ff6-4f91-9495-bfe685811129","Type":"ContainerStarted","Data":"28e65b0064b8ab7fb5076536a64123cd16c86ceac493f2b62469cc96078edd73"} Apr 16 18:43:23.878007 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:23.877967 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jv4tf/must-gather-v8mxx" 
event={"ID":"219a0ffb-9ff6-4f91-9495-bfe685811129","Type":"ContainerStarted","Data":"5e79a9ddbf7d16582ccbc43461fb9e9f346d868cf46e5a95a5f6bfe178d6a46f"} Apr 16 18:43:23.878007 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:23.878013 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jv4tf/must-gather-v8mxx" event={"ID":"219a0ffb-9ff6-4f91-9495-bfe685811129","Type":"ContainerStarted","Data":"5ce6ee8c5e6a92c99a1a163c44386915b85a60684b85381f5593bd21a1f6bc64"} Apr 16 18:43:23.894714 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:23.894661 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jv4tf/must-gather-v8mxx" podStartSLOduration=0.849791653 podStartE2EDuration="4.894638053s" podCreationTimestamp="2026-04-16 18:43:19 +0000 UTC" firstStartedPulling="2026-04-16 18:43:19.533432764 +0000 UTC m=+2019.188667599" lastFinishedPulling="2026-04-16 18:43:23.578279165 +0000 UTC m=+2023.233513999" observedRunningTime="2026-04-16 18:43:23.894036942 +0000 UTC m=+2023.549271828" watchObservedRunningTime="2026-04-16 18:43:23.894638053 +0000 UTC m=+2023.549872908" Apr 16 18:43:33.604707 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:33.604662 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln_b0fca647-378a-413c-a503-ae433fdfc711/main/0.log" Apr 16 18:43:33.625410 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:33.625378 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln_b0fca647-378a-413c-a503-ae433fdfc711/storage-initializer/0.log" Apr 16 18:43:34.680559 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:34.680519 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln_b0fca647-378a-413c-a503-ae433fdfc711/main/0.log" Apr 16 18:43:34.689723 
ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:34.689690 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln_b0fca647-378a-413c-a503-ae433fdfc711/storage-initializer/0.log" Apr 16 18:43:35.752660 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:35.752623 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln_b0fca647-378a-413c-a503-ae433fdfc711/main/0.log" Apr 16 18:43:35.765509 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:35.765471 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln_b0fca647-378a-413c-a503-ae433fdfc711/storage-initializer/0.log" Apr 16 18:43:36.819637 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:36.819595 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln_b0fca647-378a-413c-a503-ae433fdfc711/main/0.log" Apr 16 18:43:36.829492 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:36.829465 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln_b0fca647-378a-413c-a503-ae433fdfc711/storage-initializer/0.log" Apr 16 18:43:37.927102 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:37.927060 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln_b0fca647-378a-413c-a503-ae433fdfc711/main/0.log" Apr 16 18:43:37.938107 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:37.938081 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln_b0fca647-378a-413c-a503-ae433fdfc711/storage-initializer/0.log" Apr 16 18:43:38.985003 ip-10-0-142-43 kubenswrapper[2574]: I0416 
18:43:38.984964 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln_b0fca647-378a-413c-a503-ae433fdfc711/main/0.log" Apr 16 18:43:38.995441 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:38.995411 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln_b0fca647-378a-413c-a503-ae433fdfc711/storage-initializer/0.log" Apr 16 18:43:40.046162 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:40.046090 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln_b0fca647-378a-413c-a503-ae433fdfc711/main/0.log" Apr 16 18:43:40.054995 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:40.054961 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln_b0fca647-378a-413c-a503-ae433fdfc711/storage-initializer/0.log" Apr 16 18:43:41.109070 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:41.109041 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln_b0fca647-378a-413c-a503-ae433fdfc711/main/0.log" Apr 16 18:43:41.121768 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:41.121737 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln_b0fca647-378a-413c-a503-ae433fdfc711/storage-initializer/0.log" Apr 16 18:43:42.177021 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:42.176987 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln_b0fca647-378a-413c-a503-ae433fdfc711/main/0.log" Apr 16 18:43:42.189685 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:42.189658 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln_b0fca647-378a-413c-a503-ae433fdfc711/storage-initializer/0.log" Apr 16 18:43:43.231530 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:43.231494 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln_b0fca647-378a-413c-a503-ae433fdfc711/main/0.log" Apr 16 18:43:43.240861 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:43.240792 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln_b0fca647-378a-413c-a503-ae433fdfc711/storage-initializer/0.log" Apr 16 18:43:44.280802 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:44.280767 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln_b0fca647-378a-413c-a503-ae433fdfc711/main/0.log" Apr 16 18:43:44.290630 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:44.290587 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln_b0fca647-378a-413c-a503-ae433fdfc711/storage-initializer/0.log" Apr 16 18:43:45.337316 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:45.337280 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln_b0fca647-378a-413c-a503-ae433fdfc711/main/0.log" Apr 16 18:43:45.347863 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:45.347811 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln_b0fca647-378a-413c-a503-ae433fdfc711/storage-initializer/0.log" Apr 16 18:43:46.547319 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:46.547287 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln_b0fca647-378a-413c-a503-ae433fdfc711/main/0.log" Apr 16 18:43:46.557933 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:46.557902 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln_b0fca647-378a-413c-a503-ae433fdfc711/storage-initializer/0.log" Apr 16 18:43:47.643147 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:47.643116 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln_b0fca647-378a-413c-a503-ae433fdfc711/main/0.log" Apr 16 18:43:47.653122 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:47.653093 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln_b0fca647-378a-413c-a503-ae433fdfc711/storage-initializer/0.log" Apr 16 18:43:48.169619 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:48.169549 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" Apr 16 18:43:48.293580 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:48.293549 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b0fca647-378a-413c-a503-ae433fdfc711-home\") pod \"b0fca647-378a-413c-a503-ae433fdfc711\" (UID: \"b0fca647-378a-413c-a503-ae433fdfc711\") " Apr 16 18:43:48.293580 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:48.293590 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0fca647-378a-413c-a503-ae433fdfc711-kserve-provision-location\") pod \"b0fca647-378a-413c-a503-ae433fdfc711\" (UID: \"b0fca647-378a-413c-a503-ae433fdfc711\") " Apr 16 18:43:48.293859 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:48.293621 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qbnq\" (UniqueName: \"kubernetes.io/projected/b0fca647-378a-413c-a503-ae433fdfc711-kube-api-access-2qbnq\") pod \"b0fca647-378a-413c-a503-ae433fdfc711\" (UID: \"b0fca647-378a-413c-a503-ae433fdfc711\") " Apr 16 18:43:48.293859 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:48.293661 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b0fca647-378a-413c-a503-ae433fdfc711-model-cache\") pod \"b0fca647-378a-413c-a503-ae433fdfc711\" (UID: \"b0fca647-378a-413c-a503-ae433fdfc711\") " Apr 16 18:43:48.293859 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:48.293690 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b0fca647-378a-413c-a503-ae433fdfc711-dshm\") pod \"b0fca647-378a-413c-a503-ae433fdfc711\" (UID: \"b0fca647-378a-413c-a503-ae433fdfc711\") " Apr 16 18:43:48.293859 ip-10-0-142-43 
kubenswrapper[2574]: I0416 18:43:48.293738 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b0fca647-378a-413c-a503-ae433fdfc711-tls-certs\") pod \"b0fca647-378a-413c-a503-ae433fdfc711\" (UID: \"b0fca647-378a-413c-a503-ae433fdfc711\") " Apr 16 18:43:48.294419 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:48.294204 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0fca647-378a-413c-a503-ae433fdfc711-model-cache" (OuterVolumeSpecName: "model-cache") pod "b0fca647-378a-413c-a503-ae433fdfc711" (UID: "b0fca647-378a-413c-a503-ae433fdfc711"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:43:48.294419 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:48.294372 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0fca647-378a-413c-a503-ae433fdfc711-home" (OuterVolumeSpecName: "home") pod "b0fca647-378a-413c-a503-ae433fdfc711" (UID: "b0fca647-378a-413c-a503-ae433fdfc711"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:43:48.301414 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:48.295351 2574 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b0fca647-378a-413c-a503-ae433fdfc711-home\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:43:48.301414 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:48.295390 2574 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b0fca647-378a-413c-a503-ae433fdfc711-model-cache\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:43:48.301414 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:48.296943 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0fca647-378a-413c-a503-ae433fdfc711-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b0fca647-378a-413c-a503-ae433fdfc711" (UID: "b0fca647-378a-413c-a503-ae433fdfc711"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:43:48.301918 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:48.301867 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0fca647-378a-413c-a503-ae433fdfc711-dshm" (OuterVolumeSpecName: "dshm") pod "b0fca647-378a-413c-a503-ae433fdfc711" (UID: "b0fca647-378a-413c-a503-ae433fdfc711"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:43:48.304335 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:48.304310 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0fca647-378a-413c-a503-ae433fdfc711-kube-api-access-2qbnq" (OuterVolumeSpecName: "kube-api-access-2qbnq") pod "b0fca647-378a-413c-a503-ae433fdfc711" (UID: "b0fca647-378a-413c-a503-ae433fdfc711"). InnerVolumeSpecName "kube-api-access-2qbnq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:43:48.384474 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:48.384422 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0fca647-378a-413c-a503-ae433fdfc711-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b0fca647-378a-413c-a503-ae433fdfc711" (UID: "b0fca647-378a-413c-a503-ae433fdfc711"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:43:48.396635 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:48.396602 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0fca647-378a-413c-a503-ae433fdfc711-kserve-provision-location\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:43:48.396635 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:48.396633 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2qbnq\" (UniqueName: \"kubernetes.io/projected/b0fca647-378a-413c-a503-ae433fdfc711-kube-api-access-2qbnq\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:43:48.396855 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:48.396644 2574 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b0fca647-378a-413c-a503-ae433fdfc711-dshm\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:43:48.396855 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:48.396655 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b0fca647-378a-413c-a503-ae433fdfc711-tls-certs\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:43:48.968587 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:48.968549 2574 generic.go:358] "Generic (PLEG): container finished" podID="b0fca647-378a-413c-a503-ae433fdfc711" 
containerID="e889345807df294b2fb4cc165dae8dd4cbbafa79b659adda2660b60f557c39e6" exitCode=137 Apr 16 18:43:48.969039 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:48.968632 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" event={"ID":"b0fca647-378a-413c-a503-ae433fdfc711","Type":"ContainerDied","Data":"e889345807df294b2fb4cc165dae8dd4cbbafa79b659adda2660b60f557c39e6"} Apr 16 18:43:48.969039 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:48.968651 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" Apr 16 18:43:48.969039 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:48.968674 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln" event={"ID":"b0fca647-378a-413c-a503-ae433fdfc711","Type":"ContainerDied","Data":"fe6e190f5d6c190e5f25ef4142b4b7436dd6534c3a345bdebc4db6ec11e2de96"} Apr 16 18:43:48.969039 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:48.968691 2574 scope.go:117] "RemoveContainer" containerID="e889345807df294b2fb4cc165dae8dd4cbbafa79b659adda2660b60f557c39e6" Apr 16 18:43:48.988992 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:48.988952 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln"] Apr 16 18:43:48.992090 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:48.991951 2574 scope.go:117] "RemoveContainer" containerID="1e2b0414ac75f72c7de4aa39c8a118722ffefba127e4d8ddcf070a473da21605" Apr 16 18:43:48.992940 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:48.992913 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66786699f5-4zwln"] Apr 16 18:43:49.081849 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:49.081810 2574 
scope.go:117] "RemoveContainer" containerID="e889345807df294b2fb4cc165dae8dd4cbbafa79b659adda2660b60f557c39e6" Apr 16 18:43:49.082206 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:43:49.082182 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e889345807df294b2fb4cc165dae8dd4cbbafa79b659adda2660b60f557c39e6\": container with ID starting with e889345807df294b2fb4cc165dae8dd4cbbafa79b659adda2660b60f557c39e6 not found: ID does not exist" containerID="e889345807df294b2fb4cc165dae8dd4cbbafa79b659adda2660b60f557c39e6" Apr 16 18:43:49.082256 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:49.082221 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e889345807df294b2fb4cc165dae8dd4cbbafa79b659adda2660b60f557c39e6"} err="failed to get container status \"e889345807df294b2fb4cc165dae8dd4cbbafa79b659adda2660b60f557c39e6\": rpc error: code = NotFound desc = could not find container \"e889345807df294b2fb4cc165dae8dd4cbbafa79b659adda2660b60f557c39e6\": container with ID starting with e889345807df294b2fb4cc165dae8dd4cbbafa79b659adda2660b60f557c39e6 not found: ID does not exist" Apr 16 18:43:49.082256 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:49.082241 2574 scope.go:117] "RemoveContainer" containerID="1e2b0414ac75f72c7de4aa39c8a118722ffefba127e4d8ddcf070a473da21605" Apr 16 18:43:49.082519 ip-10-0-142-43 kubenswrapper[2574]: E0416 18:43:49.082497 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e2b0414ac75f72c7de4aa39c8a118722ffefba127e4d8ddcf070a473da21605\": container with ID starting with 1e2b0414ac75f72c7de4aa39c8a118722ffefba127e4d8ddcf070a473da21605 not found: ID does not exist" containerID="1e2b0414ac75f72c7de4aa39c8a118722ffefba127e4d8ddcf070a473da21605" Apr 16 18:43:49.082561 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:49.082531 2574 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e2b0414ac75f72c7de4aa39c8a118722ffefba127e4d8ddcf070a473da21605"} err="failed to get container status \"1e2b0414ac75f72c7de4aa39c8a118722ffefba127e4d8ddcf070a473da21605\": rpc error: code = NotFound desc = could not find container \"1e2b0414ac75f72c7de4aa39c8a118722ffefba127e4d8ddcf070a473da21605\": container with ID starting with 1e2b0414ac75f72c7de4aa39c8a118722ffefba127e4d8ddcf070a473da21605 not found: ID does not exist" Apr 16 18:43:50.456538 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:50.456458 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-svbsx_040142a4-fe8d-4316-ac7a-7e333dc75e50/authorino/0.log" Apr 16 18:43:50.502572 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:50.502546 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-nlvv7_627ff423-8d04-4247-b465-f4eedd171a6f/kuadrant-console-plugin/0.log" Apr 16 18:43:50.941796 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:50.941755 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0fca647-378a-413c-a503-ae433fdfc711" path="/var/lib/kubelet/pods/b0fca647-378a-413c-a503-ae433fdfc711/volumes" Apr 16 18:43:51.980594 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:51.980562 2574 generic.go:358] "Generic (PLEG): container finished" podID="219a0ffb-9ff6-4f91-9495-bfe685811129" containerID="5ce6ee8c5e6a92c99a1a163c44386915b85a60684b85381f5593bd21a1f6bc64" exitCode=0 Apr 16 18:43:51.981049 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:51.980634 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jv4tf/must-gather-v8mxx" event={"ID":"219a0ffb-9ff6-4f91-9495-bfe685811129","Type":"ContainerDied","Data":"5ce6ee8c5e6a92c99a1a163c44386915b85a60684b85381f5593bd21a1f6bc64"} Apr 16 18:43:51.981049 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:51.981006 2574 scope.go:117] 
"RemoveContainer" containerID="5ce6ee8c5e6a92c99a1a163c44386915b85a60684b85381f5593bd21a1f6bc64" Apr 16 18:43:52.286288 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:52.286193 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jv4tf_must-gather-v8mxx_219a0ffb-9ff6-4f91-9495-bfe685811129/gather/0.log" Apr 16 18:43:55.953448 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:55.953413 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-sx24k_15fcda3c-2ebe-475b-bd0f-7c9f1ed74875/global-pull-secret-syncer/0.log" Apr 16 18:43:56.083477 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:56.083445 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-v5jrt_7075ef9f-68cc-485d-bc93-6ebf4ae1fdd9/konnectivity-agent/0.log" Apr 16 18:43:56.155098 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:56.155067 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-142-43.ec2.internal_d2abeffdf9790c6ff1185a908891d8f2/haproxy/0.log" Apr 16 18:43:57.816198 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:57.816161 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jv4tf/must-gather-v8mxx"] Apr 16 18:43:57.816630 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:57.816394 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-jv4tf/must-gather-v8mxx" podUID="219a0ffb-9ff6-4f91-9495-bfe685811129" containerName="copy" containerID="cri-o://5e79a9ddbf7d16582ccbc43461fb9e9f346d868cf46e5a95a5f6bfe178d6a46f" gracePeriod=2 Apr 16 18:43:57.821316 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:57.821285 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jv4tf/must-gather-v8mxx"] Apr 16 18:43:58.000962 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:58.000936 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-jv4tf_must-gather-v8mxx_219a0ffb-9ff6-4f91-9495-bfe685811129/copy/0.log" Apr 16 18:43:58.001288 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:58.001264 2574 generic.go:358] "Generic (PLEG): container finished" podID="219a0ffb-9ff6-4f91-9495-bfe685811129" containerID="5e79a9ddbf7d16582ccbc43461fb9e9f346d868cf46e5a95a5f6bfe178d6a46f" exitCode=143 Apr 16 18:43:58.056940 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:58.056914 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jv4tf_must-gather-v8mxx_219a0ffb-9ff6-4f91-9495-bfe685811129/copy/0.log" Apr 16 18:43:58.057293 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:58.057278 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jv4tf/must-gather-v8mxx" Apr 16 18:43:58.059589 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:58.059564 2574 status_manager.go:895] "Failed to get status for pod" podUID="219a0ffb-9ff6-4f91-9495-bfe685811129" pod="openshift-must-gather-jv4tf/must-gather-v8mxx" err="pods \"must-gather-v8mxx\" is forbidden: User \"system:node:ip-10-0-142-43.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-jv4tf\": no relationship found between node 'ip-10-0-142-43.ec2.internal' and this object" Apr 16 18:43:58.182426 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:58.182391 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/219a0ffb-9ff6-4f91-9495-bfe685811129-must-gather-output\") pod \"219a0ffb-9ff6-4f91-9495-bfe685811129\" (UID: \"219a0ffb-9ff6-4f91-9495-bfe685811129\") " Apr 16 18:43:58.182614 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:58.182487 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntdph\" (UniqueName: 
\"kubernetes.io/projected/219a0ffb-9ff6-4f91-9495-bfe685811129-kube-api-access-ntdph\") pod \"219a0ffb-9ff6-4f91-9495-bfe685811129\" (UID: \"219a0ffb-9ff6-4f91-9495-bfe685811129\") " Apr 16 18:43:58.184739 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:58.184702 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/219a0ffb-9ff6-4f91-9495-bfe685811129-kube-api-access-ntdph" (OuterVolumeSpecName: "kube-api-access-ntdph") pod "219a0ffb-9ff6-4f91-9495-bfe685811129" (UID: "219a0ffb-9ff6-4f91-9495-bfe685811129"). InnerVolumeSpecName "kube-api-access-ntdph". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:43:58.188800 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:58.188768 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/219a0ffb-9ff6-4f91-9495-bfe685811129-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "219a0ffb-9ff6-4f91-9495-bfe685811129" (UID: "219a0ffb-9ff6-4f91-9495-bfe685811129"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:43:58.284126 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:58.284084 2574 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/219a0ffb-9ff6-4f91-9495-bfe685811129-must-gather-output\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:43:58.284126 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:58.284116 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ntdph\" (UniqueName: \"kubernetes.io/projected/219a0ffb-9ff6-4f91-9495-bfe685811129-kube-api-access-ntdph\") on node \"ip-10-0-142-43.ec2.internal\" DevicePath \"\"" Apr 16 18:43:58.938879 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:58.938820 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="219a0ffb-9ff6-4f91-9495-bfe685811129" path="/var/lib/kubelet/pods/219a0ffb-9ff6-4f91-9495-bfe685811129/volumes" Apr 16 18:43:59.005566 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:59.005532 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jv4tf_must-gather-v8mxx_219a0ffb-9ff6-4f91-9495-bfe685811129/copy/0.log" Apr 16 18:43:59.005928 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:59.005912 2574 scope.go:117] "RemoveContainer" containerID="5e79a9ddbf7d16582ccbc43461fb9e9f346d868cf46e5a95a5f6bfe178d6a46f" Apr 16 18:43:59.006006 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:59.005932 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jv4tf/must-gather-v8mxx" Apr 16 18:43:59.013353 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:43:59.013332 2574 scope.go:117] "RemoveContainer" containerID="5ce6ee8c5e6a92c99a1a163c44386915b85a60684b85381f5593bd21a1f6bc64" Apr 16 18:44:00.349510 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:00.349479 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-svbsx_040142a4-fe8d-4316-ac7a-7e333dc75e50/authorino/0.log" Apr 16 18:44:00.435024 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:00.434999 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-nlvv7_627ff423-8d04-4247-b465-f4eedd171a6f/kuadrant-console-plugin/0.log" Apr 16 18:44:02.111818 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:02.111780 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6r68b_c1a22ad6-2e26-41ba-918c-624abea492fb/node-exporter/0.log" Apr 16 18:44:02.140754 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:02.140726 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6r68b_c1a22ad6-2e26-41ba-918c-624abea492fb/kube-rbac-proxy/0.log" Apr 16 18:44:02.167608 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:02.167581 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6r68b_c1a22ad6-2e26-41ba-918c-624abea492fb/init-textfile/0.log" Apr 16 18:44:02.600159 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:02.600128 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-xwkwz_bc9d6d92-ba2b-4d9e-b44b-455ceaa353ef/prometheus-operator/0.log" Apr 16 18:44:02.628634 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:02.628606 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-xwkwz_bc9d6d92-ba2b-4d9e-b44b-455ceaa353ef/kube-rbac-proxy/0.log" Apr 16 18:44:04.443694 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:04.443660 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4lsvd_1be4b879-19c7-4497-badb-3f90683cdd48/console-operator/1.log" Apr 16 18:44:04.448013 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:04.447991 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-4lsvd_1be4b879-19c7-4497-badb-3f90683cdd48/console-operator/2.log" Apr 16 18:44:04.457390 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:04.457361 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qmqrm/perf-node-gather-daemonset-td4mr"] Apr 16 18:44:04.457658 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:04.457646 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0fca647-378a-413c-a503-ae433fdfc711" containerName="storage-initializer" Apr 16 18:44:04.457702 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:04.457660 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0fca647-378a-413c-a503-ae433fdfc711" containerName="storage-initializer" Apr 16 18:44:04.457702 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:04.457673 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="219a0ffb-9ff6-4f91-9495-bfe685811129" containerName="gather" Apr 16 18:44:04.457702 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:04.457679 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="219a0ffb-9ff6-4f91-9495-bfe685811129" containerName="gather" Apr 16 18:44:04.457702 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:04.457697 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0fca647-378a-413c-a503-ae433fdfc711" containerName="main" Apr 16 
18:44:04.457702 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:04.457702 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0fca647-378a-413c-a503-ae433fdfc711" containerName="main" Apr 16 18:44:04.457877 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:04.457711 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="219a0ffb-9ff6-4f91-9495-bfe685811129" containerName="copy" Apr 16 18:44:04.457877 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:04.457716 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="219a0ffb-9ff6-4f91-9495-bfe685811129" containerName="copy" Apr 16 18:44:04.457877 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:04.457761 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="219a0ffb-9ff6-4f91-9495-bfe685811129" containerName="copy" Apr 16 18:44:04.457877 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:04.457769 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="b0fca647-378a-413c-a503-ae433fdfc711" containerName="main" Apr 16 18:44:04.457877 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:04.457775 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="219a0ffb-9ff6-4f91-9495-bfe685811129" containerName="gather" Apr 16 18:44:04.460558 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:04.460539 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-td4mr" Apr 16 18:44:04.463146 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:04.463121 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qmqrm\"/\"kube-root-ca.crt\"" Apr 16 18:44:04.464195 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:04.464176 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qmqrm\"/\"openshift-service-ca.crt\"" Apr 16 18:44:04.464279 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:04.464176 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-qmqrm\"/\"default-dockercfg-5hdfg\"" Apr 16 18:44:04.468584 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:04.468558 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qmqrm/perf-node-gather-daemonset-td4mr"] Apr 16 18:44:04.538171 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:04.538126 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f462e470-959c-471d-8104-4362c1412355-podres\") pod \"perf-node-gather-daemonset-td4mr\" (UID: \"f462e470-959c-471d-8104-4362c1412355\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-td4mr" Apr 16 18:44:04.538171 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:04.538172 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f462e470-959c-471d-8104-4362c1412355-lib-modules\") pod \"perf-node-gather-daemonset-td4mr\" (UID: \"f462e470-959c-471d-8104-4362c1412355\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-td4mr" Apr 16 18:44:04.538406 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:04.538203 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-89mzh\" (UniqueName: \"kubernetes.io/projected/f462e470-959c-471d-8104-4362c1412355-kube-api-access-89mzh\") pod \"perf-node-gather-daemonset-td4mr\" (UID: \"f462e470-959c-471d-8104-4362c1412355\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-td4mr" Apr 16 18:44:04.538406 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:04.538241 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f462e470-959c-471d-8104-4362c1412355-proc\") pod \"perf-node-gather-daemonset-td4mr\" (UID: \"f462e470-959c-471d-8104-4362c1412355\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-td4mr" Apr 16 18:44:04.538406 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:04.538289 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f462e470-959c-471d-8104-4362c1412355-sys\") pod \"perf-node-gather-daemonset-td4mr\" (UID: \"f462e470-959c-471d-8104-4362c1412355\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-td4mr" Apr 16 18:44:04.639290 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:04.639242 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f462e470-959c-471d-8104-4362c1412355-proc\") pod \"perf-node-gather-daemonset-td4mr\" (UID: \"f462e470-959c-471d-8104-4362c1412355\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-td4mr" Apr 16 18:44:04.639453 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:04.639312 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f462e470-959c-471d-8104-4362c1412355-sys\") pod \"perf-node-gather-daemonset-td4mr\" (UID: \"f462e470-959c-471d-8104-4362c1412355\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-td4mr" Apr 16 
18:44:04.639453 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:04.639344 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f462e470-959c-471d-8104-4362c1412355-podres\") pod \"perf-node-gather-daemonset-td4mr\" (UID: \"f462e470-959c-471d-8104-4362c1412355\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-td4mr" Apr 16 18:44:04.639453 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:04.639376 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f462e470-959c-471d-8104-4362c1412355-lib-modules\") pod \"perf-node-gather-daemonset-td4mr\" (UID: \"f462e470-959c-471d-8104-4362c1412355\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-td4mr" Apr 16 18:44:04.639453 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:04.639379 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f462e470-959c-471d-8104-4362c1412355-proc\") pod \"perf-node-gather-daemonset-td4mr\" (UID: \"f462e470-959c-471d-8104-4362c1412355\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-td4mr" Apr 16 18:44:04.639453 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:04.639420 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-89mzh\" (UniqueName: \"kubernetes.io/projected/f462e470-959c-471d-8104-4362c1412355-kube-api-access-89mzh\") pod \"perf-node-gather-daemonset-td4mr\" (UID: \"f462e470-959c-471d-8104-4362c1412355\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-td4mr" Apr 16 18:44:04.639453 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:04.639437 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f462e470-959c-471d-8104-4362c1412355-sys\") pod \"perf-node-gather-daemonset-td4mr\" (UID: 
\"f462e470-959c-471d-8104-4362c1412355\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-td4mr" Apr 16 18:44:04.639658 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:04.639508 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f462e470-959c-471d-8104-4362c1412355-podres\") pod \"perf-node-gather-daemonset-td4mr\" (UID: \"f462e470-959c-471d-8104-4362c1412355\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-td4mr" Apr 16 18:44:04.639658 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:04.639509 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f462e470-959c-471d-8104-4362c1412355-lib-modules\") pod \"perf-node-gather-daemonset-td4mr\" (UID: \"f462e470-959c-471d-8104-4362c1412355\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-td4mr" Apr 16 18:44:04.647917 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:04.647884 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-89mzh\" (UniqueName: \"kubernetes.io/projected/f462e470-959c-471d-8104-4362c1412355-kube-api-access-89mzh\") pod \"perf-node-gather-daemonset-td4mr\" (UID: \"f462e470-959c-471d-8104-4362c1412355\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-td4mr" Apr 16 18:44:04.771231 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:04.771140 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-td4mr" Apr 16 18:44:04.892910 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:04.892862 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qmqrm/perf-node-gather-daemonset-td4mr"] Apr 16 18:44:04.898123 ip-10-0-142-43 kubenswrapper[2574]: W0416 18:44:04.898083 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf462e470_959c_471d_8104_4362c1412355.slice/crio-ab9115f7332c91840e883f2f7efeffe65aa742d38ba36e79e5d495bf252e1bc7 WatchSource:0}: Error finding container ab9115f7332c91840e883f2f7efeffe65aa742d38ba36e79e5d495bf252e1bc7: Status 404 returned error can't find the container with id ab9115f7332c91840e883f2f7efeffe65aa742d38ba36e79e5d495bf252e1bc7 Apr 16 18:44:05.027887 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:05.027849 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-td4mr" event={"ID":"f462e470-959c-471d-8104-4362c1412355","Type":"ContainerStarted","Data":"ab9115f7332c91840e883f2f7efeffe65aa742d38ba36e79e5d495bf252e1bc7"} Apr 16 18:44:06.032435 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:06.032392 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-td4mr" event={"ID":"f462e470-959c-471d-8104-4362c1412355","Type":"ContainerStarted","Data":"ab55dbae86b9f479e31afc0ea7dd404981561ab7d8af058b08113a58a5ab9212"} Apr 16 18:44:06.032857 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:06.032544 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-td4mr" Apr 16 18:44:06.049478 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:06.049426 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-td4mr" 
podStartSLOduration=2.04941119 podStartE2EDuration="2.04941119s" podCreationTimestamp="2026-04-16 18:44:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:44:06.048596338 +0000 UTC m=+2065.703831202" watchObservedRunningTime="2026-04-16 18:44:06.04941119 +0000 UTC m=+2065.704646043" Apr 16 18:44:06.199331 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:06.199302 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-mj8zq_befca98d-bc99-4ce7-82eb-9f84457dc655/dns/0.log" Apr 16 18:44:06.222900 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:06.222868 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-mj8zq_befca98d-bc99-4ce7-82eb-9f84457dc655/kube-rbac-proxy/0.log" Apr 16 18:44:06.275416 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:06.275385 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-b7tsj_25e589f1-86e1-42cd-a623-02b4361d82ee/dns-node-resolver/0.log" Apr 16 18:44:06.752742 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:06.752701 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-545cbcd4f-txvcv_1af7ee53-3f96-4f7d-8957-898bf7c0c8e9/registry/0.log" Apr 16 18:44:06.828187 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:06.828158 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fd7p9_8e7dc651-2c6d-4d3f-912d-c2f49dfd76a0/node-ca/0.log" Apr 16 18:44:08.182344 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:08.182310 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-2zvsx_e4c20834-fffd-49b6-be94-da4be1bc80a8/serve-healthcheck-canary/0.log" Apr 16 18:44:08.834142 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:08.834114 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-rl96g_b0990fea-8fdf-473f-ab80-def726bcd0aa/kube-rbac-proxy/0.log" Apr 16 18:44:08.860772 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:08.860746 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rl96g_b0990fea-8fdf-473f-ab80-def726bcd0aa/exporter/0.log" Apr 16 18:44:08.884605 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:08.884576 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rl96g_b0990fea-8fdf-473f-ab80-def726bcd0aa/extractor/0.log" Apr 16 18:44:11.373661 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:11.373631 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-65bdb464b4-w7ldw_6bcf994e-a332-4d30-b965-6c5ddd6d6fa6/manager/0.log" Apr 16 18:44:12.046660 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:12.046634 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-td4mr" Apr 16 18:44:18.997105 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:18.997075 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lwp8t_a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9/kube-multus-additional-cni-plugins/0.log" Apr 16 18:44:19.023027 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:19.022993 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lwp8t_a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9/egress-router-binary-copy/0.log" Apr 16 18:44:19.048254 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:19.048229 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lwp8t_a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9/cni-plugins/0.log" Apr 16 18:44:19.072956 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:19.072928 2574 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lwp8t_a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9/bond-cni-plugin/0.log" Apr 16 18:44:19.096558 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:19.096530 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lwp8t_a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9/routeoverride-cni/0.log" Apr 16 18:44:19.120221 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:19.120191 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lwp8t_a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9/whereabouts-cni-bincopy/0.log" Apr 16 18:44:19.145156 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:19.145122 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lwp8t_a2b2c2ec-3dc4-4bfe-9e87-dd6fa7f7edc9/whereabouts-cni/0.log" Apr 16 18:44:19.178270 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:19.178247 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-t2zvc_0fe74f00-c50b-4f93-a926-43b61e8e6182/kube-multus/0.log" Apr 16 18:44:19.295281 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:19.295197 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-kgtvr_182ef3ca-8527-40a2-b1a7-c714bd3509c5/network-metrics-daemon/0.log" Apr 16 18:44:19.319147 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:19.319120 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-kgtvr_182ef3ca-8527-40a2-b1a7-c714bd3509c5/kube-rbac-proxy/0.log" Apr 16 18:44:20.787429 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:20.787400 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wt28v_d40f8597-4c6c-46ba-9f26-4cea171429a6/ovn-controller/0.log" Apr 16 18:44:20.808062 ip-10-0-142-43 
kubenswrapper[2574]: I0416 18:44:20.808032 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wt28v_d40f8597-4c6c-46ba-9f26-4cea171429a6/ovn-acl-logging/0.log" Apr 16 18:44:20.817975 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:20.817942 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wt28v_d40f8597-4c6c-46ba-9f26-4cea171429a6/ovn-acl-logging/1.log" Apr 16 18:44:20.839985 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:20.839958 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wt28v_d40f8597-4c6c-46ba-9f26-4cea171429a6/kube-rbac-proxy-node/0.log" Apr 16 18:44:20.868605 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:20.868574 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wt28v_d40f8597-4c6c-46ba-9f26-4cea171429a6/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 18:44:20.891778 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:20.891745 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wt28v_d40f8597-4c6c-46ba-9f26-4cea171429a6/northd/0.log" Apr 16 18:44:20.914249 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:20.914220 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wt28v_d40f8597-4c6c-46ba-9f26-4cea171429a6/nbdb/0.log" Apr 16 18:44:20.938214 ip-10-0-142-43 kubenswrapper[2574]: I0416 18:44:20.938188 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wt28v_d40f8597-4c6c-46ba-9f26-4cea171429a6/sbdb/0.log"