Apr 22 18:46:53.430822 ip-10-0-136-53 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 18:46:53.901795 ip-10-0-136-53 kubenswrapper[2579]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:46:53.901795 ip-10-0-136-53 kubenswrapper[2579]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 18:46:53.901795 ip-10-0-136-53 kubenswrapper[2579]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:46:53.901795 ip-10-0-136-53 kubenswrapper[2579]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 18:46:53.901795 ip-10-0-136-53 kubenswrapper[2579]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:46:53.903922 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.903820 2579 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 18:46:53.906388 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906370 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:46:53.906388 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906388 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:46:53.906454 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906392 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:46:53.906454 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906396 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:46:53.906454 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906399 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:46:53.906454 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906403 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:46:53.906454 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906406 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:46:53.906454 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906409 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:46:53.906454 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906412 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:46:53.906454 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906415 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:46:53.906454 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906418 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:46:53.906454 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906421 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:46:53.906454 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906423 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:46:53.906454 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906426 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:46:53.906454 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906429 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:46:53.906454 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906431 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:46:53.906454 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906434 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:46:53.906454 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906436 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:46:53.906454 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906439 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:46:53.906454 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906441 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:46:53.906454 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906444 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:46:53.906937 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906447 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:46:53.906937 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906450 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:46:53.906937 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906453 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:46:53.906937 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906455 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:46:53.906937 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906458 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:46:53.906937 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906461 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:46:53.906937 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906464 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:46:53.906937 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906467 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:46:53.906937 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906470 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:46:53.906937 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906472 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:46:53.906937 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906475 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:46:53.906937 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906478 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:46:53.906937 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906499 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:46:53.906937 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906502 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:46:53.906937 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906505 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:46:53.906937 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906507 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:46:53.906937 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906510 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:46:53.906937 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906514 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:46:53.906937 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906518 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:46:53.906937 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906522 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:46:53.907453 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906525 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:46:53.907453 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906528 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:46:53.907453 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906530 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:46:53.907453 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906533 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:46:53.907453 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906535 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:46:53.907453 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906539 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:46:53.907453 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906541 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:46:53.907453 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906544 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:46:53.907453 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906547 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:46:53.907453 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906550 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:46:53.907453 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906554 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:46:53.907453 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906557 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:46:53.907453 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906560 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:46:53.907453 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906563 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:46:53.907453 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906567 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:46:53.907453 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906570 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:46:53.907453 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906574 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:46:53.907453 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906577 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:46:53.907453 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906581 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:46:53.907934 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906583 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:46:53.907934 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906586 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:46:53.907934 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906589 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:46:53.907934 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906592 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:46:53.907934 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906595 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:46:53.907934 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906597 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:46:53.907934 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906600 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:46:53.907934 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906603 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:46:53.907934 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906605 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:46:53.907934 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906608 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:46:53.907934 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906610 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:46:53.907934 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906613 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:46:53.907934 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906616 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:46:53.907934 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906619 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:46:53.907934 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906621 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:46:53.907934 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906624 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:46:53.907934 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906628 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:46:53.907934 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906630 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:46:53.907934 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906635 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:46:53.907934 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906637 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:46:53.908409 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906640 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:46:53.908409 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906642 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:46:53.908409 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906645 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:46:53.908409 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906647 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:46:53.908409 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906649 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:46:53.908409 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.906652 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:46:53.910812 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908281 2579 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 18:46:53.910812 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908291 2579 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 18:46:53.910812 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908298 2579 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 18:46:53.910812 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908303 2579 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 18:46:53.910812 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908308 2579 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 18:46:53.910812 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908311 2579 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 18:46:53.910812 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908316 2579 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 18:46:53.910812 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908321 2579 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 18:46:53.911365 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908324 2579 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 18:46:53.911365 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908327 2579 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 18:46:53.911365 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908332 2579 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 18:46:53.911365 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908336 2579 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 18:46:53.911365 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908340 2579 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 18:46:53.911365 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908343 2579 flags.go:64] FLAG: --cgroup-root=""
Apr 22 18:46:53.911365 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908346 2579 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 18:46:53.911365 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908349 2579 flags.go:64] FLAG: --client-ca-file=""
Apr 22 18:46:53.911365 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908352 2579 flags.go:64] FLAG: --cloud-config=""
Apr 22 18:46:53.911365 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908355 2579 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 18:46:53.911365 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908358 2579 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 18:46:53.911365 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908362 2579 flags.go:64] FLAG: --cluster-domain=""
Apr 22 18:46:53.911365 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908365 2579 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 18:46:53.911365 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908368 2579 flags.go:64] FLAG: --config-dir=""
Apr 22 18:46:53.911365 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908371 2579 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 18:46:53.911365 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908375 2579 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 18:46:53.911365 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908379 2579 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 18:46:53.911365 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908382 2579 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 18:46:53.911365 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908385 2579 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 18:46:53.911365 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908389 2579 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 18:46:53.911365 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908392 2579 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 18:46:53.911365 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908396 2579 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 18:46:53.911365 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908399 2579 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 18:46:53.911365 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908402 2579 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 18:46:53.911365 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908406 2579 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 18:46:53.911991 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908411 2579 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 18:46:53.911991 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908414 2579 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 18:46:53.911991 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908417 2579 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 18:46:53.911991 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908420 2579 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 18:46:53.911991 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908424 2579 flags.go:64] FLAG: --enable-server="true"
Apr 22 18:46:53.911991 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908427 2579 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 18:46:53.911991 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908432 2579 flags.go:64] FLAG: --event-burst="100"
Apr 22 18:46:53.911991 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908435 2579 flags.go:64] FLAG: --event-qps="50"
Apr 22 18:46:53.911991 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908438 2579 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 18:46:53.911991 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908442 2579 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 18:46:53.911991 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908445 2579 flags.go:64] FLAG: --eviction-hard=""
Apr 22 18:46:53.911991 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908449 2579 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 18:46:53.911991 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908452 2579 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 18:46:53.911991 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908456 2579 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 18:46:53.911991 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908459 2579 flags.go:64] FLAG: --eviction-soft=""
Apr 22 18:46:53.911991 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908463 2579 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 18:46:53.911991 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908465 2579 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 18:46:53.911991 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908468 2579 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 18:46:53.911991 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908472 2579 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 18:46:53.911991 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908475 2579 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 18:46:53.911991 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908478 2579 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 18:46:53.911991 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908493 2579 flags.go:64] FLAG: --feature-gates=""
Apr 22 18:46:53.911991 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908497 2579 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 18:46:53.911991 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908501 2579 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 18:46:53.911991 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908505 2579 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 22 18:46:53.912621 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908508 2579 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 22 18:46:53.912621 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908511 2579 flags.go:64] FLAG: --healthz-port="10248"
Apr 22 18:46:53.912621 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908515 2579 flags.go:64] FLAG: --help="false"
Apr 22 18:46:53.912621 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908517 2579 flags.go:64] FLAG: --hostname-override="ip-10-0-136-53.ec2.internal"
Apr 22 18:46:53.912621 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908521 2579 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 22 18:46:53.912621 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908524 2579 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 22 18:46:53.912621 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908527 2579 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 22 18:46:53.912621 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908531 2579 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 22 18:46:53.912621 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908534 2579 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 22 18:46:53.912621 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908537 2579 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 22 18:46:53.912621 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908541 2579 flags.go:64] FLAG: --image-service-endpoint=""
Apr 22 18:46:53.912621 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908543 2579 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 22 18:46:53.912621 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908547 2579 flags.go:64] FLAG: --kube-api-burst="100"
Apr 22 18:46:53.912621 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908550 2579 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 22 18:46:53.912621 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908553 2579 flags.go:64] FLAG: --kube-api-qps="50"
Apr 22 18:46:53.912621 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908556 2579 flags.go:64] FLAG: --kube-reserved=""
Apr 22 18:46:53.912621 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908560 2579 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 22 18:46:53.912621 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908563 2579 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 22 18:46:53.912621 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908566 2579 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 22 18:46:53.912621 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908569 2579 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 22 18:46:53.912621 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908572 2579 flags.go:64] FLAG: --lock-file=""
Apr 22 18:46:53.912621 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908575 2579 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 22 18:46:53.912621 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908578 2579 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 22 18:46:53.912621 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908581 2579 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 22 18:46:53.913201 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908586 2579 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 22 18:46:53.913201 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908589 2579 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 22 18:46:53.913201 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908592 2579 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 22 18:46:53.913201 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908595 2579 flags.go:64] FLAG: --logging-format="text"
Apr 22 18:46:53.913201 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908598 2579 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 22 18:46:53.913201 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908602 2579 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 22 18:46:53.913201 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908604 2579 flags.go:64] FLAG: --manifest-url=""
Apr 22 18:46:53.913201 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908607 2579 flags.go:64] FLAG: --manifest-url-header=""
Apr 22 18:46:53.913201 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908612 2579 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 22 18:46:53.913201 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908615 2579 flags.go:64] FLAG: --max-open-files="1000000"
Apr 22 18:46:53.913201 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908618 2579 flags.go:64] FLAG: --max-pods="110"
Apr 22 18:46:53.913201 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908622 2579 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 22 18:46:53.913201 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908625 2579 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 22 18:46:53.913201 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908628 2579 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 22 18:46:53.913201 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908632 2579 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 22 18:46:53.913201 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908634 2579 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 22 18:46:53.913201 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908638 2579 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 22 18:46:53.913201 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908641 2579 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 22 18:46:53.913201 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908650 2579 flags.go:64] FLAG: --node-status-max-images="50"
Apr 22 18:46:53.913201 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908654 2579 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 22 18:46:53.913201 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908657 2579 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 22 18:46:53.913201 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908660 2579 flags.go:64] FLAG: --pod-cidr=""
Apr 22 18:46:53.913201 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908663 2579 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 22 18:46:53.913779 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908669 2579 flags.go:64] FLAG: --pod-manifest-path=""
Apr 22 18:46:53.913779 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908671 2579 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 22 18:46:53.913779 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908675 2579 flags.go:64] FLAG: --pods-per-core="0"
Apr 22 18:46:53.913779 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908678 2579 flags.go:64] FLAG: --port="10250"
Apr 22 18:46:53.913779 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908681 2579 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 22 18:46:53.913779 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908684 2579 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-06c08dc60c8beb3a1"
Apr 22 18:46:53.913779 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908687 2579 flags.go:64] FLAG: --qos-reserved=""
Apr 22 18:46:53.913779 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908690 2579 flags.go:64] FLAG: --read-only-port="10255"
Apr 22 18:46:53.913779 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908693 2579 flags.go:64] FLAG: --register-node="true"
Apr 22 18:46:53.913779 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908696 2579 flags.go:64] FLAG: --register-schedulable="true"
Apr 22 18:46:53.913779 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908699 2579 flags.go:64] FLAG: --register-with-taints=""
Apr 22 18:46:53.913779 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908703 2579 flags.go:64] FLAG: --registry-burst="10"
Apr 22 18:46:53.913779 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908706 2579 flags.go:64] FLAG: --registry-qps="5"
Apr 22 18:46:53.913779 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908709 2579 flags.go:64] FLAG: --reserved-cpus=""
Apr 22 18:46:53.913779 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908712 2579 flags.go:64] FLAG: --reserved-memory=""
Apr 22 18:46:53.913779 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908716 2579 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 22 18:46:53.913779 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908719 2579 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 22 18:46:53.913779 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908722 2579 flags.go:64] FLAG: --rotate-certificates="false"
Apr 22 18:46:53.913779 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908725 2579 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 22 18:46:53.913779 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908728 2579 flags.go:64] FLAG: --runonce="false"
Apr 22 18:46:53.913779 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908731 2579 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 22 18:46:53.913779 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908734 2579 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 22 18:46:53.913779 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908737 2579 flags.go:64] FLAG: --seccomp-default="false"
Apr 22 18:46:53.913779 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908740 2579 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 22 18:46:53.913779 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908743 2579 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 22 18:46:53.913779 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908746 2579 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 22 18:46:53.914410 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908750 2579 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 22 18:46:53.914410 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908753 2579 flags.go:64] FLAG: --storage-driver-password="root"
Apr 22 18:46:53.914410 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908756 2579 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 22 18:46:53.914410 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908759 2579 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 22 18:46:53.914410 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908761 2579 flags.go:64] FLAG: --storage-driver-user="root"
Apr 22 18:46:53.914410 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908764 2579 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 22 18:46:53.914410 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908768 2579 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 22 18:46:53.914410 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908770 2579 flags.go:64] FLAG: --system-cgroups=""
Apr 22 18:46:53.914410 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908773 2579 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 22 18:46:53.914410 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908779 2579 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 22 18:46:53.914410 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908782 2579 flags.go:64] FLAG: --tls-cert-file=""
Apr 22 18:46:53.914410 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908792 2579 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 22 18:46:53.914410 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908796 2579 flags.go:64] FLAG: --tls-min-version=""
Apr 22 18:46:53.914410 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908799 2579 flags.go:64] FLAG: --tls-private-key-file=""
Apr 22 18:46:53.914410 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908801 2579 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 22 18:46:53.914410 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908804 2579 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 22 18:46:53.914410 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908807 2579 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 22 18:46:53.914410 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908810 2579 flags.go:64] FLAG: --v="2"
Apr 22 18:46:53.914410 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908815 2579 flags.go:64] FLAG: --version="false"
Apr 22 18:46:53.914410 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908819 2579 flags.go:64] FLAG: --vmodule=""
Apr 22 18:46:53.914410 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908823 2579 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 22 18:46:53.914410 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.908827 2579 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
18:46:53.909603 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:46:53.916575 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.909608 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:46:53.916575 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.909612 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:46:53.916575 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.909616 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:46:53.916575 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.909620 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:46:53.917073 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.909624 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:46:53.917073 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.909629 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:46:53.917073 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.909637 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:46:53.917073 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.909641 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:46:53.917073 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.909648 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:46:53.917073 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.910847 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:46:53.917919 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.917900 2579 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 18:46:53.917961 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.917920 2579 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 22 18:46:53.917990 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.917971 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:46:53.917990 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.917976 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:46:53.917990 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.917979 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:46:53.917990 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.917983 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:46:53.917990 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.917986 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:46:53.917990 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.917989 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:46:53.917990 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.917992 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:46:53.918170 
ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.917996 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:46:53.918170 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.917998 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:46:53.918170 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918001 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:46:53.918170 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918004 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:46:53.918170 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918007 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:46:53.918170 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918010 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:46:53.918170 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918012 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:46:53.918170 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918015 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:46:53.918170 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918018 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:46:53.918170 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918021 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:46:53.918170 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918023 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:46:53.918170 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918026 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:46:53.918170 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918029 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:46:53.918170 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918032 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:46:53.918170 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918036 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:46:53.918170 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918038 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:46:53.918170 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918041 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:46:53.918170 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918043 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:46:53.918170 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918046 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:46:53.918664 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918049 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:46:53.918664 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918051 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:46:53.918664 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918054 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:46:53.918664 ip-10-0-136-53 kubenswrapper[2579]: W0422 
18:46:53.918056 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:46:53.918664 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918059 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:46:53.918664 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918062 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:46:53.918664 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918065 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:46:53.918664 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918067 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:46:53.918664 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918070 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:46:53.918664 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918072 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:46:53.918664 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918076 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:46:53.918664 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918080 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 18:46:53.918664 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918083 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:46:53.918664 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918087 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:46:53.918664 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918091 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:46:53.918664 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918094 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:46:53.918664 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918097 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:46:53.918664 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918100 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:46:53.918664 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918103 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:46:53.919127 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918105 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:46:53.919127 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918108 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:46:53.919127 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918111 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:46:53.919127 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918114 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:46:53.919127 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918116 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:46:53.919127 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918119 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:46:53.919127 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918122 2579 
feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:46:53.919127 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918124 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:46:53.919127 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918127 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:46:53.919127 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918129 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:46:53.919127 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918132 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:46:53.919127 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918134 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:46:53.919127 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918137 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:46:53.919127 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918140 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:46:53.919127 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918142 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:46:53.919127 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918145 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:46:53.919127 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918147 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:46:53.919127 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918150 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:46:53.919127 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918153 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 18:46:53.919631 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918157 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:46:53.919631 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918160 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:46:53.919631 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918163 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:46:53.919631 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918165 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:46:53.919631 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918168 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:46:53.919631 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918170 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:46:53.919631 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918173 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:46:53.919631 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918176 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:46:53.919631 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918179 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:46:53.919631 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918183 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:46:53.919631 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918185 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:46:53.919631 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918189 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:46:53.919631 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918191 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:46:53.919631 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918194 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:46:53.919631 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918197 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:46:53.919631 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918199 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:46:53.919631 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918202 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:46:53.919631 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918204 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:46:53.919631 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918207 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:46:53.919631 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918209 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:46:53.920113 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918212 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:46:53.920113 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918214 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:46:53.920113 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.918220 2579 feature_gate.go:384] feature gates: 
{map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:46:53.920113 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918321 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:46:53.920113 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918327 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:46:53.920113 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918329 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:46:53.920113 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918332 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:46:53.920113 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918335 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:46:53.920113 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918338 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:46:53.920113 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918340 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:46:53.920113 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918343 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:46:53.920113 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918346 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:46:53.920113 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918348 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:46:53.920113 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918351 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:46:53.920113 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918353 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:46:53.920113 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918355 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:46:53.920515 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918359 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:46:53.920515 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918362 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:46:53.920515 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918364 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:46:53.920515 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918367 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:46:53.920515 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918370 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:46:53.920515 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918373 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:46:53.920515 ip-10-0-136-53 kubenswrapper[2579]: W0422 
18:46:53.918375 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:46:53.920515 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918379 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:46:53.920515 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918381 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:46:53.920515 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918384 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:46:53.920515 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918387 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:46:53.920515 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918390 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:46:53.920515 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918392 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:46:53.920515 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918395 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:46:53.920515 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918398 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 18:46:53.920515 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918401 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:46:53.920515 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918404 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:46:53.920515 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918407 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:46:53.920963 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918410 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:46:53.920963 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918412 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:46:53.920963 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918415 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:46:53.920963 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918418 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:46:53.920963 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918420 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:46:53.920963 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918423 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:46:53.920963 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918426 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:46:53.920963 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918428 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:46:53.920963 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918431 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:46:53.920963 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918433 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:46:53.920963 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918436 
2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:46:53.920963 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918438 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:46:53.920963 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918441 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:46:53.920963 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918444 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:46:53.920963 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918446 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:46:53.920963 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918449 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:46:53.920963 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918451 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:46:53.920963 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918454 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:46:53.920963 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918457 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:46:53.920963 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918460 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:46:53.921448 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918463 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:46:53.921448 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918465 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:46:53.921448 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918468 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:46:53.921448 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918470 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:46:53.921448 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918473 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:46:53.921448 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918476 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:46:53.921448 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918479 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:46:53.921448 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918503 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:46:53.921448 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918506 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:46:53.921448 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918508 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:46:53.921448 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918511 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:46:53.921448 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918513 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:46:53.921448 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918516 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:46:53.921448 
ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918519 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:46:53.921448 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918522 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:46:53.921448 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918525 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:46:53.921448 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918528 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:46:53.921448 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918530 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:46:53.921448 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918533 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:46:53.921448 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918535 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:46:53.921948 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918538 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:46:53.921948 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918540 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:46:53.921948 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918543 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:46:53.921948 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918545 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:46:53.921948 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918548 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:46:53.921948 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918551 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:46:53.921948 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918554 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
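
Mixed into the unrecognized-gate noise are two warnings about gates the kubelet does recognize: KMSv1 (deprecated) and ServiceAccountTokenNodeBinding (already GA, so locked on). Both entries still take effect; the warnings only signal that they will stop meaning anything once the gates are removed upstream. In a hand-maintained config file they could simply be dropped, e.g.:

    featureGates:
      # ServiceAccountTokenNodeBinding: true   # GA: already on by default, entry can be removed
      # KMSv1: true                            # deprecated: scheduled for removal
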
Apr 22 18:46:53.921948 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918558 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:46:53.921948 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918561 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:46:53.921948 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918564 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:46:53.921948 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918567 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:46:53.921948 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918570 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:46:53.921948 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918573 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:46:53.921948 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918575 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:46:53.921948 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:53.918578 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:46:53.922318 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.918583 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:46:53.922318 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.919429 2579 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 18:46:53.924668 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.924652 2579 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 18:46:53.925678 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.925667 2579 server.go:1019] "Starting client certificate rotation" Apr 22 18:46:53.925790 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.925771 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 18:46:53.925851 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.925824 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 18:46:53.954447 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.954412 2579 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 18:46:53.959261 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.959238 2579 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 18:46:53.972165 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.972142 2579 log.go:25] "Validated CRI v1 runtime API" Apr 22 18:46:53.982032 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.982008 2579 log.go:25] "Validated CRI v1 image API" Apr 22 18:46:53.983418 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.983402 2579 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" 
Apr 22 18:46:53.987152 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.987130 2579 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 938a124c-0481-4399-ab5b-3e9f29b69b44:/dev/nvme0n1p4 a51efdc4-663a-4920-b906-c42d9580bc33:/dev/nvme0n1p3] Apr 22 18:46:53.987201 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.987152 2579 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 18:46:53.993020 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.992898 2579 manager.go:217] Machine: {Timestamp:2026-04-22 18:46:53.990946877 +0000 UTC m=+0.439729198 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098395 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec29ff989974f2d0a72e1cb71b3c109f SystemUUID:ec29ff98-9974-f2d0-a72e-1cb71b3c109f BootID:e9f4358e-c2ec-4e30-9f0c-4a55d9c46dae Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:e2:fd:2d:c5:3b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:e2:fd:2d:c5:3b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:5e:8b:cf:8a:d6:19 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 18:46:53.993020 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.993012 2579 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. 
Perf event counters are not available. Apr 22 18:46:53.993132 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.993098 2579 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 18:46:53.994951 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.994919 2579 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 18:46:53.995102 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.994953 2579 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-136-53.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 18:46:53.995185 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.995111 2579 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 18:46:53.995185 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.995119 2579 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 18:46:53.995185 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.995133 2579 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:46:53.995354 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.995336 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 18:46:53.995841 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.995828 2579 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:46:53.997429 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.997417 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:46:53.997794 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:53.997783 2579 server.go:1267] "Using 
root directory" path="/var/lib/kubelet" Apr 22 18:46:54.000299 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.000288 2579 kubelet.go:491] "Attempting to sync node with API server" Apr 22 18:46:54.000347 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.000304 2579 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 18:46:54.000347 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.000320 2579 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 18:46:54.000347 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.000331 2579 kubelet.go:397] "Adding apiserver pod source" Apr 22 18:46:54.000347 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.000341 2579 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 18:46:54.001460 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.001447 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:46:54.001521 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.001468 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:46:54.004644 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.004625 2579 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 18:46:54.006039 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.006025 2579 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 18:46:54.007966 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.007951 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 18:46:54.007966 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.007968 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 18:46:54.008068 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.007974 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 18:46:54.008068 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.007980 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 18:46:54.008068 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.007986 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 18:46:54.008068 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.007991 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 18:46:54.008068 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.007997 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 18:46:54.008068 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.008007 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 18:46:54.008068 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.008015 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 18:46:54.008068 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.008020 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 18:46:54.008068 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.008033 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 18:46:54.008068 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.008042 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" 
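
The container manager dump above (nodeConfig={...}) is the node's effective resource-reservation and eviction policy. Rendered as config-file fields, the same values would read:

    # Values copied from the NodeConfig logged above; layout is illustrative only.
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    systemReserved:
      cpu: 500m
      memory: 1Gi
      ephemeral-storage: 1Gi
    evictionHard:
      memory.available: 100Mi
      nodefs.available: "10%"
      nodefs.inodesFree: "5%"
      imagefs.available: "15%"
      imagefs.inodesFree: "5%"
    podPidsLimit: 4096
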
Apr 22 18:46:54.008883 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.008870 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 18:46:54.008883 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.008882 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 18:46:54.010211 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.010189 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-fz662" Apr 22 18:46:54.012543 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.012526 2579 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-136-53.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 18:46:54.013005 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.012989 2579 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 18:46:54.013082 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.013027 2579 server.go:1295] "Started kubelet" Apr 22 18:46:54.013316 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.013276 2579 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 18:46:54.013379 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.013334 2579 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 18:46:54.013729 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:54.013704 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 18:46:54.013729 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:54.013707 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-136-53.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 18:46:54.013841 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.013774 2579 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 18:46:54.014080 ip-10-0-136-53 systemd[1]: Started Kubernetes Kubelet. 
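
The "system:anonymous cannot ..." errors in this stretch are a normal bootstrap artifact: the kubelet starts on bootstrap credentials, and API calls made before its client certificate exists are rejected as anonymous. The CSR lines around this point (csr-fz662 approved above, issued just below) close that window, after which the watches succeed. The serving side being brought up in the same breath maps onto these config fields (illustrative fragment matching the logged listen address/port and rotation behavior, not necessarily this node's literal file):

    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    address: 0.0.0.0
    port: 10250
    staticPodPath: /etc/kubernetes/manifests   # "Adding static pod path" above
    rotateCertificates: true                   # "Client rotation is on" / "Starting client certificate rotation"
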
Apr 22 18:46:54.014631 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.014498 2579 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 18:46:54.016555 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.016541 2579 server.go:317] "Adding debug handlers to kubelet server" Apr 22 18:46:54.017168 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.017149 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-fz662" Apr 22 18:46:54.019465 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:54.018458 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-53.ec2.internal.18a8c23b8c750ed6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-53.ec2.internal,UID:ip-10-0-136-53.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-136-53.ec2.internal,},FirstTimestamp:2026-04-22 18:46:54.01300143 +0000 UTC m=+0.461783751,LastTimestamp:2026-04-22 18:46:54.01300143 +0000 UTC m=+0.461783751,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-53.ec2.internal,}" Apr 22 18:46:54.021115 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.021098 2579 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 18:46:54.021191 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.021115 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 18:46:54.023473 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.023309 2579 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 18:46:54.023584 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.023478 2579 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 18:46:54.023584 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:54.023534 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-53.ec2.internal\" not found" Apr 22 18:46:54.024861 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.023298 2579 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 18:46:54.024861 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.023853 2579 reconstruct.go:97] "Volume reconstruction finished" Apr 22 18:46:54.024861 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.023862 2579 reconciler.go:26] "Reconciler: start to sync state" Apr 22 18:46:54.025508 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.025478 2579 factory.go:55] Registering systemd factory Apr 22 18:46:54.025999 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:54.025976 2579 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-136-53.ec2.internal\" not found" node="ip-10-0-136-53.ec2.internal" Apr 22 18:46:54.026181 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.026162 2579 factory.go:223] Registration of the systemd container factory successfully Apr 22 18:46:54.026340 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.026327 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" 
reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:54.027462 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.027448 2579 factory.go:153] Registering CRI-O factory Apr 22 18:46:54.027556 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.027466 2579 factory.go:223] Registration of the crio container factory successfully Apr 22 18:46:54.027556 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.027538 2579 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 18:46:54.027665 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.027556 2579 factory.go:103] Registering Raw factory Apr 22 18:46:54.027665 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.027586 2579 manager.go:1196] Started watching for new ooms in manager Apr 22 18:46:54.027751 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:54.027664 2579 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 18:46:54.027952 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.027938 2579 manager.go:319] Starting recovery of all containers Apr 22 18:46:54.037297 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.037179 2579 manager.go:324] Recovery completed Apr 22 18:46:54.041309 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.041293 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:54.044312 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.044295 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-53.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:54.044377 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.044325 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-53.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:54.044377 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.044336 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-53.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:54.044882 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.044862 2579 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 18:46:54.044882 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.044878 2579 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 18:46:54.044975 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.044896 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:46:54.047511 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.047498 2579 policy_none.go:49] "None policy: Start" Apr 22 18:46:54.047557 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.047515 2579 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 18:46:54.047557 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.047525 2579 state_mem.go:35] "Initializing new in-memory state store" Apr 22 18:46:54.085195 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.085177 2579 manager.go:341] "Starting Device Plugin manager" Apr 22 18:46:54.099328 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:54.085228 2579 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 18:46:54.099328 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.085241 
2579 server.go:85] "Starting device plugin registration server" Apr 22 18:46:54.099328 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.085532 2579 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 18:46:54.099328 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.085544 2579 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 18:46:54.099328 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.085725 2579 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 18:46:54.099328 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.085805 2579 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 18:46:54.099328 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.085813 2579 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 18:46:54.099328 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:54.086270 2579 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 18:46:54.099328 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:54.086313 2579 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-136-53.ec2.internal\" not found" Apr 22 18:46:54.157596 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.157504 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 18:46:54.158764 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.158744 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 18:46:54.158827 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.158781 2579 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 18:46:54.158827 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.158805 2579 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
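
Two housekeeping details surface here. The systemd watchdog check is skipped ("Systemd watchdog is not enabled"), which typically just means the kubelet unit sets no WatchdogSec=; and the container log rotation workers start with the values visible in the message (workers=1, monitorPeriod="10s"). On kubelet versions that expose them (assumed available in this v1.33 kubelet), the corresponding config fields are:

    # Illustrative; containerLogMaxWorkers / containerLogMonitorInterval are
    # assumed present in this kubelet version and match the logged values.
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    containerLogMaxSize: 10Mi         # upstream default rotation threshold
    containerLogMaxFiles: 5           # upstream default retained files
    containerLogMaxWorkers: 1         # workers=1 in the log line above
    containerLogMonitorInterval: 10s  # monitorPeriod="10s" above
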
Apr 22 18:46:54.158827 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.158814 2579 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 18:46:54.158926 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:54.158858 2579 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 18:46:54.161159 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.161136 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:54.186059 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.186013 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:54.187218 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.187194 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-53.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:54.187342 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.187232 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-53.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:54.187342 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.187248 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-53.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:54.187342 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.187277 2579 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-136-53.ec2.internal" Apr 22 18:46:54.195127 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.195107 2579 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-136-53.ec2.internal" Apr 22 18:46:54.195199 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:54.195153 2579 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-136-53.ec2.internal\": node \"ip-10-0-136-53.ec2.internal\" not found" Apr 22 18:46:54.214637 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:54.214613 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-53.ec2.internal\" not found" Apr 22 18:46:54.259637 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.259603 2579 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-53.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-136-53.ec2.internal"] Apr 22 18:46:54.259704 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.259689 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:54.263216 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.263197 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-53.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:54.263322 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.263227 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-53.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:54.263322 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.263237 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-53.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:54.264553 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.264538 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:54.264710 ip-10-0-136-53 kubenswrapper[2579]: I0422 
18:46:54.264693 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-53.ec2.internal" Apr 22 18:46:54.264771 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.264731 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:54.265276 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.265262 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-53.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:54.265276 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.265269 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-53.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:54.265390 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.265287 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-53.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:54.265390 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.265289 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-53.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:54.265390 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.265301 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-53.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:54.265390 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.265308 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-53.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:54.266571 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.266558 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-53.ec2.internal" Apr 22 18:46:54.266631 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.266580 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:54.267241 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.267218 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-53.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:54.267325 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.267248 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-53.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:54.267325 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.267259 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-53.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:54.296378 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:54.296349 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-53.ec2.internal\" not found" node="ip-10-0-136-53.ec2.internal" Apr 22 18:46:54.300963 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:54.300944 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-53.ec2.internal\" not found" node="ip-10-0-136-53.ec2.internal" Apr 22 18:46:54.314937 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:54.314906 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-53.ec2.internal\" not found" Apr 22 18:46:54.326429 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.326388 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/890c59873ddf29c2d5fdc1745f24f9dc-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-53.ec2.internal\" (UID: \"890c59873ddf29c2d5fdc1745f24f9dc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-53.ec2.internal" Apr 22 18:46:54.326572 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.326434 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/890c59873ddf29c2d5fdc1745f24f9dc-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-53.ec2.internal\" (UID: \"890c59873ddf29c2d5fdc1745f24f9dc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-53.ec2.internal" Apr 22 18:46:54.326572 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.326457 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/837597ad9723f3b9499e99b045f4e6a9-config\") pod \"kube-apiserver-proxy-ip-10-0-136-53.ec2.internal\" (UID: \"837597ad9723f3b9499e99b045f4e6a9\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-53.ec2.internal" Apr 22 18:46:54.415834 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:54.415747 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-53.ec2.internal\" not found" Apr 22 18:46:54.427258 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.427227 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/890c59873ddf29c2d5fdc1745f24f9dc-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-136-53.ec2.internal\" (UID: \"890c59873ddf29c2d5fdc1745f24f9dc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-53.ec2.internal" Apr 22 18:46:54.427341 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.427267 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/890c59873ddf29c2d5fdc1745f24f9dc-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-53.ec2.internal\" (UID: \"890c59873ddf29c2d5fdc1745f24f9dc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-53.ec2.internal" Apr 22 18:46:54.427341 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.427294 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/837597ad9723f3b9499e99b045f4e6a9-config\") pod \"kube-apiserver-proxy-ip-10-0-136-53.ec2.internal\" (UID: \"837597ad9723f3b9499e99b045f4e6a9\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-53.ec2.internal" Apr 22 18:46:54.427406 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.427342 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/890c59873ddf29c2d5fdc1745f24f9dc-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-53.ec2.internal\" (UID: \"890c59873ddf29c2d5fdc1745f24f9dc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-53.ec2.internal" Apr 22 18:46:54.427406 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.427362 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/890c59873ddf29c2d5fdc1745f24f9dc-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-53.ec2.internal\" (UID: \"890c59873ddf29c2d5fdc1745f24f9dc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-53.ec2.internal" Apr 22 18:46:54.427467 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.427412 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/837597ad9723f3b9499e99b045f4e6a9-config\") pod \"kube-apiserver-proxy-ip-10-0-136-53.ec2.internal\" (UID: \"837597ad9723f3b9499e99b045f4e6a9\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-53.ec2.internal" Apr 22 18:46:54.516717 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:54.516672 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-53.ec2.internal\" not found" Apr 22 18:46:54.598342 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.598305 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-53.ec2.internal" Apr 22 18:46:54.603955 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.603934 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-53.ec2.internal" Apr 22 18:46:54.617711 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:54.617688 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-53.ec2.internal\" not found" Apr 22 18:46:54.718551 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:54.718423 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-53.ec2.internal\" not found" Apr 22 18:46:54.819111 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:54.819073 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-53.ec2.internal\" not found" Apr 22 18:46:54.919979 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:54.919949 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-53.ec2.internal\" not found" Apr 22 18:46:54.925312 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.925292 2579 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 18:46:54.925444 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.925429 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 18:46:54.925480 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:54.925462 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 18:46:55.019281 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:55.019242 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 18:41:54 +0000 UTC" deadline="2028-01-09 23:47:29.106999722 +0000 UTC" Apr 22 18:46:55.019281 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:55.019273 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15053h0m34.087728513s" Apr 22 18:46:55.020309 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:55.020290 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-53.ec2.internal\" not found" Apr 22 18:46:55.021710 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:55.021682 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 18:46:55.030261 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:55.030237 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 18:46:55.051813 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:55.051775 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-wt4gg" Apr 22 18:46:55.057018 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:55.056997 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-wt4gg" Apr 22 18:46:55.110628 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:55.110585 2579 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod837597ad9723f3b9499e99b045f4e6a9.slice/crio-9b160c11311e78a9ce0837d25f839de631c8b567d8eaa1a94926363f10f27870 WatchSource:0}: Error finding container 9b160c11311e78a9ce0837d25f839de631c8b567d8eaa1a94926363f10f27870: Status 404 returned error can't find the container with id 9b160c11311e78a9ce0837d25f839de631c8b567d8eaa1a94926363f10f27870 Apr 22 18:46:55.110871 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:55.110843 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod890c59873ddf29c2d5fdc1745f24f9dc.slice/crio-3d34bab2bfc4196490ed2317a45c0571107300acd4cad0e91acd2a9f465b48e6 WatchSource:0}: Error finding container 3d34bab2bfc4196490ed2317a45c0571107300acd4cad0e91acd2a9f465b48e6: Status 404 returned error can't find the container with id 3d34bab2bfc4196490ed2317a45c0571107300acd4cad0e91acd2a9f465b48e6 Apr 22 18:46:55.114412 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:55.114396 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:46:55.121143 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:55.121114 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-53.ec2.internal\" not found" Apr 22 18:46:55.128784 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:55.128767 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:55.162426 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:55.162374 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-53.ec2.internal" event={"ID":"837597ad9723f3b9499e99b045f4e6a9","Type":"ContainerStarted","Data":"9b160c11311e78a9ce0837d25f839de631c8b567d8eaa1a94926363f10f27870"} Apr 22 18:46:55.163334 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:55.163315 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-53.ec2.internal" event={"ID":"890c59873ddf29c2d5fdc1745f24f9dc","Type":"ContainerStarted","Data":"3d34bab2bfc4196490ed2317a45c0571107300acd4cad0e91acd2a9f465b48e6"} Apr 22 18:46:55.221596 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:55.221554 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-53.ec2.internal\" not found" Apr 22 18:46:55.322170 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:55.322072 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-53.ec2.internal\" not found" Apr 22 18:46:55.422734 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:55.422684 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-53.ec2.internal\" not found" Apr 22 18:46:55.523658 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:55.523620 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-53.ec2.internal\" not found" Apr 22 18:46:55.563259 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:55.563226 2579 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:55.622102 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:55.622015 2579 kubelet.go:3340] "Creating a mirror pod for static pod" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-136-53.ec2.internal" Apr 22 18:46:55.631791 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:55.631678 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:46:55.632714 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:55.632692 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-53.ec2.internal" Apr 22 18:46:55.640800 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:55.640778 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:46:56.001147 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.001115 2579 apiserver.go:52] "Watching apiserver" Apr 22 18:46:56.007450 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.007423 2579 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 18:46:56.010217 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.010190 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-d76kb","openshift-multus/multus-additional-cni-plugins-b6nbf","openshift-multus/network-metrics-daemon-z69xm","openshift-network-diagnostics/network-check-target-fsnhs","openshift-network-operator/iptables-alerter-9lgpr","kube-system/kube-apiserver-proxy-ip-10-0-136-53.ec2.internal","openshift-image-registry/node-ca-whszt","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-53.ec2.internal","openshift-multus/multus-ssc8m","openshift-ovn-kubernetes/ovnkube-node-tcrqv","kube-system/konnectivity-agent-j98vd","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vtnlw","openshift-cluster-node-tuning-operator/tuned-pzgs4"] Apr 22 18:46:56.011708 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.011682 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-whszt" Apr 22 18:46:56.012852 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.012832 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-b6nbf" Apr 22 18:46:56.014104 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.014001 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z69xm" Apr 22 18:46:56.014104 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:56.014091 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z69xm" podUID="89068fbb-3d09-49a2-a369-e83cb48aaf0f" Apr 22 18:46:56.014255 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.014129 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 18:46:56.014255 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.014150 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-wg8t6\"" Apr 22 18:46:56.014355 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.014283 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 18:46:56.014408 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.014393 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 18:46:56.014722 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.014706 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 18:46:56.015136 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.015120 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsnhs" Apr 22 18:46:56.015219 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:56.015181 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fsnhs" podUID="a85a3a29-77c8-4cb9-b029-9f13cfa79162" Apr 22 18:46:56.015340 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.015322 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 18:46:56.015442 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.015349 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 18:46:56.015583 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.015568 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 18:46:56.015658 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.015622 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 18:46:56.015846 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.015827 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-zh5qh\"" Apr 22 18:46:56.016321 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.016304 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-9lgpr" Apr 22 18:46:56.019452 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.018801 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-d76kb" Apr 22 18:46:56.019452 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.018840 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:46:56.019452 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.018967 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-j7zqv\"" Apr 22 18:46:56.019452 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.019354 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 18:46:56.019452 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.019403 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 18:46:56.020738 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.020717 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.021223 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.021201 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-bnk4n\"" Apr 22 18:46:56.021361 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.021339 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 18:46:56.021691 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.021674 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 18:46:56.022265 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.022248 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.022848 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.022780 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 18:46:56.022945 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.022880 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-pkl5k\"" Apr 22 18:46:56.023547 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.023468 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-j98vd" Apr 22 18:46:56.024972 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.024955 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vtnlw" Apr 22 18:46:56.025533 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.025514 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 18:46:56.025625 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.025571 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 18:46:56.025679 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.025574 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 18:46:56.025679 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.025629 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-kbwq8\"" Apr 22 18:46:56.025845 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.025828 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 18:46:56.025910 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.025891 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 18:46:56.025910 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.025898 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 18:46:56.026009 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.025926 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 18:46:56.026059 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.026015 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 18:46:56.026255 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.026240 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-mgd5m\"" Apr 22 18:46:56.027037 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.027019 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 18:46:56.027467 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.027429 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.027584 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.027542 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 18:46:56.027655 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.027635 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-rqfjz\"" Apr 22 18:46:56.027892 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.027745 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 18:46:56.029585 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.029567 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-dn7rk\"" Apr 22 18:46:56.029709 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.029691 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:46:56.029856 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.029768 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 18:46:56.036234 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.036211 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-multus-conf-dir\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.036353 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.036248 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-host-slash\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.036353 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.036274 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e741af54-1487-45cb-b5ea-22577783221e-env-overrides\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.036353 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.036301 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1935b284-bdc2-4d26-b763-4142392b2f07-etc-selinux\") pod \"aws-ebs-csi-driver-node-vtnlw\" (UID: \"1935b284-bdc2-4d26-b763-4142392b2f07\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vtnlw" Apr 22 18:46:56.036521 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.036351 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66lmd\" (UniqueName: \"kubernetes.io/projected/1935b284-bdc2-4d26-b763-4142392b2f07-kube-api-access-66lmd\") pod \"aws-ebs-csi-driver-node-vtnlw\" (UID: \"1935b284-bdc2-4d26-b763-4142392b2f07\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vtnlw" Apr 22 18:46:56.036521 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.036381 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89068fbb-3d09-49a2-a369-e83cb48aaf0f-metrics-certs\") pod \"network-metrics-daemon-z69xm\" (UID: \"89068fbb-3d09-49a2-a369-e83cb48aaf0f\") " pod="openshift-multus/network-metrics-daemon-z69xm" Apr 22 18:46:56.036521 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.036405 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b919031c-5663-434c-ad1b-7f33fd17b752-system-cni-dir\") pod \"multus-additional-cni-plugins-b6nbf\" (UID: \"b919031c-5663-434c-ad1b-7f33fd17b752\") " pod="openshift-multus/multus-additional-cni-plugins-b6nbf" Apr 22 18:46:56.036521 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.036460 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-hostroot\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.036521 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.036478 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-etc-kubernetes\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.036521 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.036517 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b919031c-5663-434c-ad1b-7f33fd17b752-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-b6nbf\" (UID: \"b919031c-5663-434c-ad1b-7f33fd17b752\") " pod="openshift-multus/multus-additional-cni-plugins-b6nbf" Apr 22 18:46:56.036786 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.036544 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/84618b00-ccd9-4eef-b5aa-c79f222e29a9-multus-daemon-config\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.036786 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.036566 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz5zv\" (UniqueName: \"kubernetes.io/projected/84618b00-ccd9-4eef-b5aa-c79f222e29a9-kube-api-access-dz5zv\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.036786 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.036585 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-run-openvswitch\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.036786 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.036609 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e741af54-1487-45cb-b5ea-22577783221e-ovn-node-metrics-cert\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.036786 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.036647 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1935b284-bdc2-4d26-b763-4142392b2f07-sys-fs\") pod \"aws-ebs-csi-driver-node-vtnlw\" (UID: \"1935b284-bdc2-4d26-b763-4142392b2f07\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vtnlw" Apr 22 18:46:56.036786 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.036667 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d751b5d6-3a64-4cdd-addf-8bc01e1ef230-serviceca\") pod \"node-ca-whszt\" (UID: \"d751b5d6-3a64-4cdd-addf-8bc01e1ef230\") " pod="openshift-image-registry/node-ca-whszt" Apr 22 18:46:56.036786 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.036690 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjck2\" (UniqueName: \"kubernetes.io/projected/d751b5d6-3a64-4cdd-addf-8bc01e1ef230-kube-api-access-hjck2\") pod \"node-ca-whszt\" (UID: \"d751b5d6-3a64-4cdd-addf-8bc01e1ef230\") " pod="openshift-image-registry/node-ca-whszt" Apr 22 18:46:56.036786 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.036744 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jjdd\" (UniqueName: \"kubernetes.io/projected/e9c820a3-1c2e-4d37-bfc4-d18d772642d3-kube-api-access-6jjdd\") pod \"iptables-alerter-9lgpr\" (UID: \"e9c820a3-1c2e-4d37-bfc4-d18d772642d3\") " pod="openshift-network-operator/iptables-alerter-9lgpr" Apr 22 18:46:56.036786 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.036783 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b919031c-5663-434c-ad1b-7f33fd17b752-cnibin\") pod \"multus-additional-cni-plugins-b6nbf\" (UID: \"b919031c-5663-434c-ad1b-7f33fd17b752\") " pod="openshift-multus/multus-additional-cni-plugins-b6nbf" Apr 22 18:46:56.037147 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.036829 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-host-cni-bin\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.037147 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.036860 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1935b284-bdc2-4d26-b763-4142392b2f07-socket-dir\") pod \"aws-ebs-csi-driver-node-vtnlw\" (UID: \"1935b284-bdc2-4d26-b763-4142392b2f07\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vtnlw" Apr 22 18:46:56.037147 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.036884 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" 
(UniqueName: \"kubernetes.io/configmap/e9c820a3-1c2e-4d37-bfc4-d18d772642d3-iptables-alerter-script\") pod \"iptables-alerter-9lgpr\" (UID: \"e9c820a3-1c2e-4d37-bfc4-d18d772642d3\") " pod="openshift-network-operator/iptables-alerter-9lgpr" Apr 22 18:46:56.037147 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.036909 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e9c820a3-1c2e-4d37-bfc4-d18d772642d3-host-slash\") pod \"iptables-alerter-9lgpr\" (UID: \"e9c820a3-1c2e-4d37-bfc4-d18d772642d3\") " pod="openshift-network-operator/iptables-alerter-9lgpr" Apr 22 18:46:56.037147 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.036944 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-cnibin\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.037147 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.036967 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-multus-socket-dir-parent\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.037147 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.036990 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-host-run-multus-certs\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.037147 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.037030 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-etc-openvswitch\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.037147 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.037075 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-host-run-ovn-kubernetes\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.037147 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.037118 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-multus-cni-dir\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.037622 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.037175 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-run-systemd\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.037622 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.037202 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/96372a97-6aae-4e84-bcb3-d4446e5315ed-tmp-dir\") pod \"node-resolver-d76kb\" (UID: \"96372a97-6aae-4e84-bcb3-d4446e5315ed\") " pod="openshift-dns/node-resolver-d76kb" Apr 22 18:46:56.037622 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.037226 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-system-cni-dir\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.037622 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.037251 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-host-run-netns\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.037622 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.037276 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vtlh\" (UniqueName: \"kubernetes.io/projected/e741af54-1487-45cb-b5ea-22577783221e-kube-api-access-7vtlh\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.037622 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.037301 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpt5q\" (UniqueName: \"kubernetes.io/projected/a85a3a29-77c8-4cb9-b029-9f13cfa79162-kube-api-access-vpt5q\") pod \"network-check-target-fsnhs\" (UID: \"a85a3a29-77c8-4cb9-b029-9f13cfa79162\") " pod="openshift-network-diagnostics/network-check-target-fsnhs" Apr 22 18:46:56.037622 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.037324 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klh47\" (UniqueName: \"kubernetes.io/projected/96372a97-6aae-4e84-bcb3-d4446e5315ed-kube-api-access-klh47\") pod \"node-resolver-d76kb\" (UID: \"96372a97-6aae-4e84-bcb3-d4446e5315ed\") " pod="openshift-dns/node-resolver-d76kb" Apr 22 18:46:56.037622 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.037348 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b919031c-5663-434c-ad1b-7f33fd17b752-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b6nbf\" (UID: \"b919031c-5663-434c-ad1b-7f33fd17b752\") " pod="openshift-multus/multus-additional-cni-plugins-b6nbf" Apr 22 18:46:56.037622 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.037394 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-host-run-netns\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.037622 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.037420 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-host-cni-netd\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.037622 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.037446 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.037622 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.037472 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1935b284-bdc2-4d26-b763-4142392b2f07-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vtnlw\" (UID: \"1935b284-bdc2-4d26-b763-4142392b2f07\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vtnlw" Apr 22 18:46:56.037622 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.037515 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1935b284-bdc2-4d26-b763-4142392b2f07-registration-dir\") pod \"aws-ebs-csi-driver-node-vtnlw\" (UID: \"1935b284-bdc2-4d26-b763-4142392b2f07\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vtnlw" Apr 22 18:46:56.037622 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.037557 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/4ac72934-9084-4fe8-82a7-ad61ce635cb5-konnectivity-ca\") pod \"konnectivity-agent-j98vd\" (UID: \"4ac72934-9084-4fe8-82a7-ad61ce635cb5\") " pod="kube-system/konnectivity-agent-j98vd" Apr 22 18:46:56.037622 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.037584 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d751b5d6-3a64-4cdd-addf-8bc01e1ef230-host\") pod \"node-ca-whszt\" (UID: \"d751b5d6-3a64-4cdd-addf-8bc01e1ef230\") " pod="openshift-image-registry/node-ca-whszt" Apr 22 18:46:56.037622 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.037607 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b919031c-5663-434c-ad1b-7f33fd17b752-os-release\") pod \"multus-additional-cni-plugins-b6nbf\" (UID: \"b919031c-5663-434c-ad1b-7f33fd17b752\") " pod="openshift-multus/multus-additional-cni-plugins-b6nbf" Apr 22 18:46:56.038262 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.037649 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s48xc\" (UniqueName: \"kubernetes.io/projected/b919031c-5663-434c-ad1b-7f33fd17b752-kube-api-access-s48xc\") pod \"multus-additional-cni-plugins-b6nbf\" (UID: \"b919031c-5663-434c-ad1b-7f33fd17b752\") " pod="openshift-multus/multus-additional-cni-plugins-b6nbf" Apr 22 18:46:56.038262 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.037686 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e741af54-1487-45cb-b5ea-22577783221e-ovnkube-config\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.038262 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.037722 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e741af54-1487-45cb-b5ea-22577783221e-ovnkube-script-lib\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.038262 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.037745 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lpdd\" (UniqueName: \"kubernetes.io/projected/89068fbb-3d09-49a2-a369-e83cb48aaf0f-kube-api-access-2lpdd\") pod \"network-metrics-daemon-z69xm\" (UID: \"89068fbb-3d09-49a2-a369-e83cb48aaf0f\") " pod="openshift-multus/network-metrics-daemon-z69xm" Apr 22 18:46:56.038262 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.037793 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b919031c-5663-434c-ad1b-7f33fd17b752-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b6nbf\" (UID: \"b919031c-5663-434c-ad1b-7f33fd17b752\") " pod="openshift-multus/multus-additional-cni-plugins-b6nbf" Apr 22 18:46:56.038262 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.037839 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-host-var-lib-kubelet\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.038262 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.037866 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/4ac72934-9084-4fe8-82a7-ad61ce635cb5-agent-certs\") pod \"konnectivity-agent-j98vd\" (UID: \"4ac72934-9084-4fe8-82a7-ad61ce635cb5\") " pod="kube-system/konnectivity-agent-j98vd" Apr 22 18:46:56.038262 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.037890 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-host-run-k8s-cni-cncf-io\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.038262 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.037914 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-host-var-lib-cni-multus\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.038262 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.037938 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-systemd-units\") pod 
\"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.038262 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.037960 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/84618b00-ccd9-4eef-b5aa-c79f222e29a9-cni-binary-copy\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.038262 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.038027 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-host-kubelet\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.038262 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.038055 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-run-ovn\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.038262 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.038090 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/96372a97-6aae-4e84-bcb3-d4446e5315ed-hosts-file\") pod \"node-resolver-d76kb\" (UID: \"96372a97-6aae-4e84-bcb3-d4446e5315ed\") " pod="openshift-dns/node-resolver-d76kb" Apr 22 18:46:56.038262 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.038116 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-host-var-lib-cni-bin\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.038262 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.038139 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-var-lib-openvswitch\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.038921 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.038162 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-node-log\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.038921 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.038185 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-log-socket\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.038921 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.038215 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1935b284-bdc2-4d26-b763-4142392b2f07-device-dir\") pod \"aws-ebs-csi-driver-node-vtnlw\" (UID: \"1935b284-bdc2-4d26-b763-4142392b2f07\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vtnlw" Apr 22 18:46:56.038921 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.038254 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b919031c-5663-434c-ad1b-7f33fd17b752-cni-binary-copy\") pod \"multus-additional-cni-plugins-b6nbf\" (UID: \"b919031c-5663-434c-ad1b-7f33fd17b752\") " pod="openshift-multus/multus-additional-cni-plugins-b6nbf" Apr 22 18:46:56.038921 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.038290 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-os-release\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.058018 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.057973 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:41:55 +0000 UTC" deadline="2027-12-10 04:22:39.622953464 +0000 UTC" Apr 22 18:46:56.058018 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.058002 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14313h35m43.564955986s" Apr 22 18:46:56.118865 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.118834 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:56.125085 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.125064 2579 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 18:46:56.133470 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.133441 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:56.139144 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.139113 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e9c820a3-1c2e-4d37-bfc4-d18d772642d3-iptables-alerter-script\") pod \"iptables-alerter-9lgpr\" (UID: \"e9c820a3-1c2e-4d37-bfc4-d18d772642d3\") " pod="openshift-network-operator/iptables-alerter-9lgpr" Apr 22 18:46:56.139144 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.139155 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e9c820a3-1c2e-4d37-bfc4-d18d772642d3-host-slash\") pod \"iptables-alerter-9lgpr\" (UID: \"e9c820a3-1c2e-4d37-bfc4-d18d772642d3\") " pod="openshift-network-operator/iptables-alerter-9lgpr" Apr 22 18:46:56.139349 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.139183 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-cnibin\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.139349 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.139206 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-multus-socket-dir-parent\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.139349 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.139231 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-host-run-multus-certs\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.139349 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.139258 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/00b1549e-8f7a-42ba-a69c-02f254fd3955-tmp\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.139349 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.139283 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-etc-openvswitch\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.139349 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.139302 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-cnibin\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.139349 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.139314 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-host-run-ovn-kubernetes\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.139349 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.139347 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-multus-cni-dir\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.139349 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.139354 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-host-run-ovn-kubernetes\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.139794 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.139380 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-run-systemd\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.139794 ip-10-0-136-53 kubenswrapper[2579]: I0422 
18:46:56.139407 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/96372a97-6aae-4e84-bcb3-d4446e5315ed-tmp-dir\") pod \"node-resolver-d76kb\" (UID: \"96372a97-6aae-4e84-bcb3-d4446e5315ed\") " pod="openshift-dns/node-resolver-d76kb" Apr 22 18:46:56.139794 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.139436 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-system-cni-dir\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.139794 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.139439 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-host-run-multus-certs\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.139794 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.139460 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-host-run-netns\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.139794 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.139511 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-etc-openvswitch\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.139794 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.139521 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e9c820a3-1c2e-4d37-bfc4-d18d772642d3-host-slash\") pod \"iptables-alerter-9lgpr\" (UID: \"e9c820a3-1c2e-4d37-bfc4-d18d772642d3\") " pod="openshift-network-operator/iptables-alerter-9lgpr" Apr 22 18:46:56.139794 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.139539 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/00b1549e-8f7a-42ba-a69c-02f254fd3955-etc-systemd\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.139794 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.139567 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vtlh\" (UniqueName: \"kubernetes.io/projected/e741af54-1487-45cb-b5ea-22577783221e-kube-api-access-7vtlh\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.139794 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.139575 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-multus-cni-dir\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.139794 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.139593 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vpt5q\" (UniqueName: \"kubernetes.io/projected/a85a3a29-77c8-4cb9-b029-9f13cfa79162-kube-api-access-vpt5q\") pod \"network-check-target-fsnhs\" (UID: \"a85a3a29-77c8-4cb9-b029-9f13cfa79162\") " pod="openshift-network-diagnostics/network-check-target-fsnhs" Apr 22 18:46:56.139794 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.139610 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-run-systemd\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.139794 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.139621 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-klh47\" (UniqueName: \"kubernetes.io/projected/96372a97-6aae-4e84-bcb3-d4446e5315ed-kube-api-access-klh47\") pod \"node-resolver-d76kb\" (UID: \"96372a97-6aae-4e84-bcb3-d4446e5315ed\") " pod="openshift-dns/node-resolver-d76kb" Apr 22 18:46:56.139794 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.139738 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e9c820a3-1c2e-4d37-bfc4-d18d772642d3-iptables-alerter-script\") pod \"iptables-alerter-9lgpr\" (UID: \"e9c820a3-1c2e-4d37-bfc4-d18d772642d3\") " pod="openshift-network-operator/iptables-alerter-9lgpr" Apr 22 18:46:56.140523 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.139910 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/96372a97-6aae-4e84-bcb3-d4446e5315ed-tmp-dir\") pod \"node-resolver-d76kb\" (UID: \"96372a97-6aae-4e84-bcb3-d4446e5315ed\") " pod="openshift-dns/node-resolver-d76kb" Apr 22 18:46:56.140523 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.139965 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b919031c-5663-434c-ad1b-7f33fd17b752-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b6nbf\" (UID: \"b919031c-5663-434c-ad1b-7f33fd17b752\") " pod="openshift-multus/multus-additional-cni-plugins-b6nbf" Apr 22 18:46:56.140523 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.139994 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/00b1549e-8f7a-42ba-a69c-02f254fd3955-etc-tuned\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.140523 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140018 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-host-run-netns\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.140523 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140051 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-host-cni-netd\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.140523 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140075 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.140523 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140100 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1935b284-bdc2-4d26-b763-4142392b2f07-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vtnlw\" (UID: \"1935b284-bdc2-4d26-b763-4142392b2f07\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vtnlw" Apr 22 18:46:56.140523 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140133 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/00b1549e-8f7a-42ba-a69c-02f254fd3955-run\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.140523 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140155 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/00b1549e-8f7a-42ba-a69c-02f254fd3955-host\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.140523 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140178 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1935b284-bdc2-4d26-b763-4142392b2f07-registration-dir\") pod \"aws-ebs-csi-driver-node-vtnlw\" (UID: \"1935b284-bdc2-4d26-b763-4142392b2f07\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vtnlw" Apr 22 18:46:56.140523 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140201 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/4ac72934-9084-4fe8-82a7-ad61ce635cb5-konnectivity-ca\") pod \"konnectivity-agent-j98vd\" (UID: \"4ac72934-9084-4fe8-82a7-ad61ce635cb5\") " pod="kube-system/konnectivity-agent-j98vd" Apr 22 18:46:56.140523 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140239 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lts4\" (UniqueName: \"kubernetes.io/projected/00b1549e-8f7a-42ba-a69c-02f254fd3955-kube-api-access-5lts4\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.140523 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140264 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d751b5d6-3a64-4cdd-addf-8bc01e1ef230-host\") pod \"node-ca-whszt\" (UID: \"d751b5d6-3a64-4cdd-addf-8bc01e1ef230\") " pod="openshift-image-registry/node-ca-whszt" Apr 22 18:46:56.140523 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.139408 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" 
(UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-multus-socket-dir-parent\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.140523 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140290 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b919031c-5663-434c-ad1b-7f33fd17b752-os-release\") pod \"multus-additional-cni-plugins-b6nbf\" (UID: \"b919031c-5663-434c-ad1b-7f33fd17b752\") " pod="openshift-multus/multus-additional-cni-plugins-b6nbf" Apr 22 18:46:56.140523 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140318 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s48xc\" (UniqueName: \"kubernetes.io/projected/b919031c-5663-434c-ad1b-7f33fd17b752-kube-api-access-s48xc\") pod \"multus-additional-cni-plugins-b6nbf\" (UID: \"b919031c-5663-434c-ad1b-7f33fd17b752\") " pod="openshift-multus/multus-additional-cni-plugins-b6nbf" Apr 22 18:46:56.140523 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140344 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e741af54-1487-45cb-b5ea-22577783221e-ovnkube-config\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.141298 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140343 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-host-run-netns\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.141298 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140374 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e741af54-1487-45cb-b5ea-22577783221e-ovnkube-script-lib\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.141298 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140399 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2lpdd\" (UniqueName: \"kubernetes.io/projected/89068fbb-3d09-49a2-a369-e83cb48aaf0f-kube-api-access-2lpdd\") pod \"network-metrics-daemon-z69xm\" (UID: \"89068fbb-3d09-49a2-a369-e83cb48aaf0f\") " pod="openshift-multus/network-metrics-daemon-z69xm" Apr 22 18:46:56.141298 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140424 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b919031c-5663-434c-ad1b-7f33fd17b752-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b6nbf\" (UID: \"b919031c-5663-434c-ad1b-7f33fd17b752\") " pod="openshift-multus/multus-additional-cni-plugins-b6nbf" Apr 22 18:46:56.141298 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140448 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-host-var-lib-kubelet\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.141298 
ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140471 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/4ac72934-9084-4fe8-82a7-ad61ce635cb5-agent-certs\") pod \"konnectivity-agent-j98vd\" (UID: \"4ac72934-9084-4fe8-82a7-ad61ce635cb5\") " pod="kube-system/konnectivity-agent-j98vd" Apr 22 18:46:56.141298 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140477 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b919031c-5663-434c-ad1b-7f33fd17b752-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b6nbf\" (UID: \"b919031c-5663-434c-ad1b-7f33fd17b752\") " pod="openshift-multus/multus-additional-cni-plugins-b6nbf" Apr 22 18:46:56.141298 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140539 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-host-run-k8s-cni-cncf-io\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.141298 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140552 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.141298 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140567 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-host-var-lib-cni-multus\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.141298 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140599 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/00b1549e-8f7a-42ba-a69c-02f254fd3955-sys\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.141298 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140605 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1935b284-bdc2-4d26-b763-4142392b2f07-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vtnlw\" (UID: \"1935b284-bdc2-4d26-b763-4142392b2f07\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vtnlw" Apr 22 18:46:56.141298 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140629 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-systemd-units\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.141298 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140640 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-host-run-netns\") pod \"ovnkube-node-tcrqv\" (UID: 
\"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.141298 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140656 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/00b1549e-8f7a-42ba-a69c-02f254fd3955-etc-sysctl-conf\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.141298 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140673 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b919031c-5663-434c-ad1b-7f33fd17b752-os-release\") pod \"multus-additional-cni-plugins-b6nbf\" (UID: \"b919031c-5663-434c-ad1b-7f33fd17b752\") " pod="openshift-multus/multus-additional-cni-plugins-b6nbf" Apr 22 18:46:56.141298 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140686 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/84618b00-ccd9-4eef-b5aa-c79f222e29a9-cni-binary-copy\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.141977 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140683 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-system-cni-dir\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.141977 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140721 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/00b1549e-8f7a-42ba-a69c-02f254fd3955-var-lib-kubelet\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.141977 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140754 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-host-kubelet\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.141977 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140783 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-run-ovn\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.141977 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140842 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-run-ovn\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.141977 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140934 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-host-run-k8s-cni-cncf-io\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.141977 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.140976 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-host-var-lib-cni-multus\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.141977 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.141002 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/96372a97-6aae-4e84-bcb3-d4446e5315ed-hosts-file\") pod \"node-resolver-d76kb\" (UID: \"96372a97-6aae-4e84-bcb3-d4446e5315ed\") " pod="openshift-dns/node-resolver-d76kb" Apr 22 18:46:56.141977 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.141041 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-host-var-lib-cni-bin\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.141977 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.141178 2579 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 18:46:56.141977 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.141250 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/84618b00-ccd9-4eef-b5aa-c79f222e29a9-cni-binary-copy\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.141977 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.141327 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-systemd-units\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.141977 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.141331 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-var-lib-openvswitch\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.141977 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.141369 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-var-lib-openvswitch\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.141977 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.141379 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-node-log\") pod \"ovnkube-node-tcrqv\" (UID: 
\"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.141977 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.141398 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-host-var-lib-kubelet\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.141977 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.141406 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-log-socket\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.141977 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.141325 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e741af54-1487-45cb-b5ea-22577783221e-ovnkube-config\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.142783 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.141419 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1935b284-bdc2-4d26-b763-4142392b2f07-registration-dir\") pod \"aws-ebs-csi-driver-node-vtnlw\" (UID: \"1935b284-bdc2-4d26-b763-4142392b2f07\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vtnlw" Apr 22 18:46:56.142783 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.141437 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1935b284-bdc2-4d26-b763-4142392b2f07-device-dir\") pod \"aws-ebs-csi-driver-node-vtnlw\" (UID: \"1935b284-bdc2-4d26-b763-4142392b2f07\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vtnlw" Apr 22 18:46:56.142783 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.141472 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d751b5d6-3a64-4cdd-addf-8bc01e1ef230-host\") pod \"node-ca-whszt\" (UID: \"d751b5d6-3a64-4cdd-addf-8bc01e1ef230\") " pod="openshift-image-registry/node-ca-whszt" Apr 22 18:46:56.142783 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.141510 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1935b284-bdc2-4d26-b763-4142392b2f07-device-dir\") pod \"aws-ebs-csi-driver-node-vtnlw\" (UID: \"1935b284-bdc2-4d26-b763-4142392b2f07\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vtnlw" Apr 22 18:46:56.142783 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.141515 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b919031c-5663-434c-ad1b-7f33fd17b752-cni-binary-copy\") pod \"multus-additional-cni-plugins-b6nbf\" (UID: \"b919031c-5663-434c-ad1b-7f33fd17b752\") " pod="openshift-multus/multus-additional-cni-plugins-b6nbf" Apr 22 18:46:56.142783 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.141543 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-os-release\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.142783 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.141552 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-host-cni-netd\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.142783 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.141555 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-host-var-lib-cni-bin\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.142783 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.141567 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-multus-conf-dir\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.142783 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.141562 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e741af54-1487-45cb-b5ea-22577783221e-ovnkube-script-lib\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.142783 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.141594 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-host-kubelet\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.142783 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.141595 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-host-slash\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.142783 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.141598 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-log-socket\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.142783 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.141626 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-multus-conf-dir\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.142783 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.141650 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/e741af54-1487-45cb-b5ea-22577783221e-env-overrides\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.142783 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.141660 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-node-log\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.142783 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.141652 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-os-release\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.142783 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.141674 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/96372a97-6aae-4e84-bcb3-d4446e5315ed-hosts-file\") pod \"node-resolver-d76kb\" (UID: \"96372a97-6aae-4e84-bcb3-d4446e5315ed\") " pod="openshift-dns/node-resolver-d76kb" Apr 22 18:46:56.143558 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.141679 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1935b284-bdc2-4d26-b763-4142392b2f07-etc-selinux\") pod \"aws-ebs-csi-driver-node-vtnlw\" (UID: \"1935b284-bdc2-4d26-b763-4142392b2f07\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vtnlw" Apr 22 18:46:56.143558 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.141716 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-66lmd\" (UniqueName: \"kubernetes.io/projected/1935b284-bdc2-4d26-b763-4142392b2f07-kube-api-access-66lmd\") pod \"aws-ebs-csi-driver-node-vtnlw\" (UID: \"1935b284-bdc2-4d26-b763-4142392b2f07\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vtnlw" Apr 22 18:46:56.143558 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.141749 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89068fbb-3d09-49a2-a369-e83cb48aaf0f-metrics-certs\") pod \"network-metrics-daemon-z69xm\" (UID: \"89068fbb-3d09-49a2-a369-e83cb48aaf0f\") " pod="openshift-multus/network-metrics-daemon-z69xm" Apr 22 18:46:56.143558 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.141895 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b919031c-5663-434c-ad1b-7f33fd17b752-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b6nbf\" (UID: \"b919031c-5663-434c-ad1b-7f33fd17b752\") " pod="openshift-multus/multus-additional-cni-plugins-b6nbf" Apr 22 18:46:56.143558 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.141955 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1935b284-bdc2-4d26-b763-4142392b2f07-etc-selinux\") pod \"aws-ebs-csi-driver-node-vtnlw\" (UID: \"1935b284-bdc2-4d26-b763-4142392b2f07\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vtnlw" Apr 22 18:46:56.143558 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.141974 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b919031c-5663-434c-ad1b-7f33fd17b752-system-cni-dir\") pod \"multus-additional-cni-plugins-b6nbf\" (UID: \"b919031c-5663-434c-ad1b-7f33fd17b752\") " pod="openshift-multus/multus-additional-cni-plugins-b6nbf" Apr 22 18:46:56.143558 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:56.141993 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:56.143558 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.142044 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-hostroot\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.143558 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.142080 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-etc-kubernetes\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.143558 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.142085 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b919031c-5663-434c-ad1b-7f33fd17b752-system-cni-dir\") pod \"multus-additional-cni-plugins-b6nbf\" (UID: \"b919031c-5663-434c-ad1b-7f33fd17b752\") " pod="openshift-multus/multus-additional-cni-plugins-b6nbf" Apr 22 18:46:56.143558 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.142097 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b919031c-5663-434c-ad1b-7f33fd17b752-cni-binary-copy\") pod \"multus-additional-cni-plugins-b6nbf\" (UID: \"b919031c-5663-434c-ad1b-7f33fd17b752\") " pod="openshift-multus/multus-additional-cni-plugins-b6nbf" Apr 22 18:46:56.143558 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.142120 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e741af54-1487-45cb-b5ea-22577783221e-env-overrides\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.143558 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.142171 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-etc-kubernetes\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.143558 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.142170 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-host-slash\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.143558 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:56.142178 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89068fbb-3d09-49a2-a369-e83cb48aaf0f-metrics-certs 
podName:89068fbb-3d09-49a2-a369-e83cb48aaf0f nodeName:}" failed. No retries permitted until 2026-04-22 18:46:56.642149488 +0000 UTC m=+3.090931806 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/89068fbb-3d09-49a2-a369-e83cb48aaf0f-metrics-certs") pod "network-metrics-daemon-z69xm" (UID: "89068fbb-3d09-49a2-a369-e83cb48aaf0f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:56.143558 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.142192 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/84618b00-ccd9-4eef-b5aa-c79f222e29a9-hostroot\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.143558 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.142214 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b919031c-5663-434c-ad1b-7f33fd17b752-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-b6nbf\" (UID: \"b919031c-5663-434c-ad1b-7f33fd17b752\") " pod="openshift-multus/multus-additional-cni-plugins-b6nbf" Apr 22 18:46:56.144283 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.142259 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/84618b00-ccd9-4eef-b5aa-c79f222e29a9-multus-daemon-config\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.144283 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.142346 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dz5zv\" (UniqueName: \"kubernetes.io/projected/84618b00-ccd9-4eef-b5aa-c79f222e29a9-kube-api-access-dz5zv\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.144283 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.142393 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/00b1549e-8f7a-42ba-a69c-02f254fd3955-etc-modprobe-d\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.144283 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.142550 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/00b1549e-8f7a-42ba-a69c-02f254fd3955-etc-sysconfig\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.144283 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.142571 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/00b1549e-8f7a-42ba-a69c-02f254fd3955-etc-kubernetes\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.144283 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.142610 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/00b1549e-8f7a-42ba-a69c-02f254fd3955-etc-sysctl-d\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.144283 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.142628 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/00b1549e-8f7a-42ba-a69c-02f254fd3955-lib-modules\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.144283 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.142667 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-run-openvswitch\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.144283 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.142694 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e741af54-1487-45cb-b5ea-22577783221e-ovn-node-metrics-cert\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.144283 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.142713 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1935b284-bdc2-4d26-b763-4142392b2f07-sys-fs\") pod \"aws-ebs-csi-driver-node-vtnlw\" (UID: \"1935b284-bdc2-4d26-b763-4142392b2f07\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vtnlw" Apr 22 18:46:56.144283 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.142732 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d751b5d6-3a64-4cdd-addf-8bc01e1ef230-serviceca\") pod \"node-ca-whszt\" (UID: \"d751b5d6-3a64-4cdd-addf-8bc01e1ef230\") " pod="openshift-image-registry/node-ca-whszt" Apr 22 18:46:56.144283 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.142754 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b919031c-5663-434c-ad1b-7f33fd17b752-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-b6nbf\" (UID: \"b919031c-5663-434c-ad1b-7f33fd17b752\") " pod="openshift-multus/multus-additional-cni-plugins-b6nbf" Apr 22 18:46:56.144283 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.142764 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hjck2\" (UniqueName: \"kubernetes.io/projected/d751b5d6-3a64-4cdd-addf-8bc01e1ef230-kube-api-access-hjck2\") pod \"node-ca-whszt\" (UID: \"d751b5d6-3a64-4cdd-addf-8bc01e1ef230\") " pod="openshift-image-registry/node-ca-whszt" Apr 22 18:46:56.144283 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.142784 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6jjdd\" (UniqueName: \"kubernetes.io/projected/e9c820a3-1c2e-4d37-bfc4-d18d772642d3-kube-api-access-6jjdd\") pod \"iptables-alerter-9lgpr\" (UID: \"e9c820a3-1c2e-4d37-bfc4-d18d772642d3\") " pod="openshift-network-operator/iptables-alerter-9lgpr" Apr 22 18:46:56.144283 ip-10-0-136-53 
kubenswrapper[2579]: I0422 18:46:56.142807 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b919031c-5663-434c-ad1b-7f33fd17b752-cnibin\") pod \"multus-additional-cni-plugins-b6nbf\" (UID: \"b919031c-5663-434c-ad1b-7f33fd17b752\") " pod="openshift-multus/multus-additional-cni-plugins-b6nbf" Apr 22 18:46:56.144283 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.142808 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-run-openvswitch\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.144283 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.142818 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/4ac72934-9084-4fe8-82a7-ad61ce635cb5-konnectivity-ca\") pod \"konnectivity-agent-j98vd\" (UID: \"4ac72934-9084-4fe8-82a7-ad61ce635cb5\") " pod="kube-system/konnectivity-agent-j98vd" Apr 22 18:46:56.145034 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.142804 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/84618b00-ccd9-4eef-b5aa-c79f222e29a9-multus-daemon-config\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.145034 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.142939 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-host-cni-bin\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.145034 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.142988 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1935b284-bdc2-4d26-b763-4142392b2f07-socket-dir\") pod \"aws-ebs-csi-driver-node-vtnlw\" (UID: \"1935b284-bdc2-4d26-b763-4142392b2f07\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vtnlw" Apr 22 18:46:56.145034 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.143016 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1935b284-bdc2-4d26-b763-4142392b2f07-sys-fs\") pod \"aws-ebs-csi-driver-node-vtnlw\" (UID: \"1935b284-bdc2-4d26-b763-4142392b2f07\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vtnlw" Apr 22 18:46:56.145034 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.143081 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b919031c-5663-434c-ad1b-7f33fd17b752-cnibin\") pod \"multus-additional-cni-plugins-b6nbf\" (UID: \"b919031c-5663-434c-ad1b-7f33fd17b752\") " pod="openshift-multus/multus-additional-cni-plugins-b6nbf" Apr 22 18:46:56.145034 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.143133 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e741af54-1487-45cb-b5ea-22577783221e-host-cni-bin\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.145034 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.143158 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1935b284-bdc2-4d26-b763-4142392b2f07-socket-dir\") pod \"aws-ebs-csi-driver-node-vtnlw\" (UID: \"1935b284-bdc2-4d26-b763-4142392b2f07\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vtnlw" Apr 22 18:46:56.145034 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.143278 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d751b5d6-3a64-4cdd-addf-8bc01e1ef230-serviceca\") pod \"node-ca-whszt\" (UID: \"d751b5d6-3a64-4cdd-addf-8bc01e1ef230\") " pod="openshift-image-registry/node-ca-whszt" Apr 22 18:46:56.145452 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:56.145434 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:46:56.145514 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:56.145459 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:46:56.145514 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:56.145473 2579 projected.go:194] Error preparing data for projected volume kube-api-access-vpt5q for pod openshift-network-diagnostics/network-check-target-fsnhs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:56.145601 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:56.145554 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a85a3a29-77c8-4cb9-b029-9f13cfa79162-kube-api-access-vpt5q podName:a85a3a29-77c8-4cb9-b029-9f13cfa79162 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:56.645536444 +0000 UTC m=+3.094318769 (durationBeforeRetry 500ms). 
Apr 22 18:46:56.145857 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.145834 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e741af54-1487-45cb-b5ea-22577783221e-ovn-node-metrics-cert\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv"
Apr 22 18:46:56.145933 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.145860 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/4ac72934-9084-4fe8-82a7-ad61ce635cb5-agent-certs\") pod \"konnectivity-agent-j98vd\" (UID: \"4ac72934-9084-4fe8-82a7-ad61ce635cb5\") " pod="kube-system/konnectivity-agent-j98vd"
Apr 22 18:46:56.147934 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.147905 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-klh47\" (UniqueName: \"kubernetes.io/projected/96372a97-6aae-4e84-bcb3-d4446e5315ed-kube-api-access-klh47\") pod \"node-resolver-d76kb\" (UID: \"96372a97-6aae-4e84-bcb3-d4446e5315ed\") " pod="openshift-dns/node-resolver-d76kb"
Apr 22 18:46:56.148403 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.148381 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vtlh\" (UniqueName: \"kubernetes.io/projected/e741af54-1487-45cb-b5ea-22577783221e-kube-api-access-7vtlh\") pod \"ovnkube-node-tcrqv\" (UID: \"e741af54-1487-45cb-b5ea-22577783221e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv"
Apr 22 18:46:56.148937 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.148914 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s48xc\" (UniqueName: \"kubernetes.io/projected/b919031c-5663-434c-ad1b-7f33fd17b752-kube-api-access-s48xc\") pod \"multus-additional-cni-plugins-b6nbf\" (UID: \"b919031c-5663-434c-ad1b-7f33fd17b752\") " pod="openshift-multus/multus-additional-cni-plugins-b6nbf"
Apr 22 18:46:56.149292 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.149249 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lpdd\" (UniqueName: \"kubernetes.io/projected/89068fbb-3d09-49a2-a369-e83cb48aaf0f-kube-api-access-2lpdd\") pod \"network-metrics-daemon-z69xm\" (UID: \"89068fbb-3d09-49a2-a369-e83cb48aaf0f\") " pod="openshift-multus/network-metrics-daemon-z69xm"
Apr 22 18:46:56.150088 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.150068 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-66lmd\" (UniqueName: \"kubernetes.io/projected/1935b284-bdc2-4d26-b763-4142392b2f07-kube-api-access-66lmd\") pod \"aws-ebs-csi-driver-node-vtnlw\" (UID: \"1935b284-bdc2-4d26-b763-4142392b2f07\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vtnlw"
Apr 22 18:46:56.150560 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.150533 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjck2\" (UniqueName: \"kubernetes.io/projected/d751b5d6-3a64-4cdd-addf-8bc01e1ef230-kube-api-access-hjck2\") pod \"node-ca-whszt\" (UID: \"d751b5d6-3a64-4cdd-addf-8bc01e1ef230\") " pod="openshift-image-registry/node-ca-whszt"
Apr 22 18:46:56.151076 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.151046 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz5zv\" (UniqueName: \"kubernetes.io/projected/84618b00-ccd9-4eef-b5aa-c79f222e29a9-kube-api-access-dz5zv\") pod \"multus-ssc8m\" (UID: \"84618b00-ccd9-4eef-b5aa-c79f222e29a9\") " pod="openshift-multus/multus-ssc8m"
Apr 22 18:46:56.151860 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.151839 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jjdd\" (UniqueName: \"kubernetes.io/projected/e9c820a3-1c2e-4d37-bfc4-d18d772642d3-kube-api-access-6jjdd\") pod \"iptables-alerter-9lgpr\" (UID: \"e9c820a3-1c2e-4d37-bfc4-d18d772642d3\") " pod="openshift-network-operator/iptables-alerter-9lgpr"
Apr 22 18:46:56.243840 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.243803 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/00b1549e-8f7a-42ba-a69c-02f254fd3955-sys\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4"
Apr 22 18:46:56.244003 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.243849 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/00b1549e-8f7a-42ba-a69c-02f254fd3955-etc-sysctl-conf\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4"
Apr 22 18:46:56.244003 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.243878 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/00b1549e-8f7a-42ba-a69c-02f254fd3955-var-lib-kubelet\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4"
Apr 22 18:46:56.244003 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.243927 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/00b1549e-8f7a-42ba-a69c-02f254fd3955-etc-modprobe-d\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4"
Apr 22 18:46:56.244003 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.243949 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/00b1549e-8f7a-42ba-a69c-02f254fd3955-etc-sysconfig\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4"
Apr 22 18:46:56.244003 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.243974 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/00b1549e-8f7a-42ba-a69c-02f254fd3955-etc-kubernetes\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4"
Apr 22 18:46:56.244003 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.243993 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/00b1549e-8f7a-42ba-a69c-02f254fd3955-var-lib-kubelet\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4"
volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/00b1549e-8f7a-42ba-a69c-02f254fd3955-var-lib-kubelet\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.244362 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.244021 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/00b1549e-8f7a-42ba-a69c-02f254fd3955-etc-sysconfig\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.244362 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.244032 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/00b1549e-8f7a-42ba-a69c-02f254fd3955-etc-kubernetes\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.244362 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.243945 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/00b1549e-8f7a-42ba-a69c-02f254fd3955-sys\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.244362 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.244049 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/00b1549e-8f7a-42ba-a69c-02f254fd3955-etc-sysctl-d\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.244362 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.244050 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/00b1549e-8f7a-42ba-a69c-02f254fd3955-etc-sysctl-conf\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.244362 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.244080 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/00b1549e-8f7a-42ba-a69c-02f254fd3955-lib-modules\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.244362 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.244129 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/00b1549e-8f7a-42ba-a69c-02f254fd3955-etc-modprobe-d\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.244362 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.244150 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/00b1549e-8f7a-42ba-a69c-02f254fd3955-tmp\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.244362 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.244153 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/00b1549e-8f7a-42ba-a69c-02f254fd3955-etc-sysctl-d\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.244362 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.244188 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/00b1549e-8f7a-42ba-a69c-02f254fd3955-etc-systemd\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.244362 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.244224 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/00b1549e-8f7a-42ba-a69c-02f254fd3955-etc-tuned\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.244362 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.244241 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/00b1549e-8f7a-42ba-a69c-02f254fd3955-lib-modules\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.244362 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.244252 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/00b1549e-8f7a-42ba-a69c-02f254fd3955-run\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.244362 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.244287 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/00b1549e-8f7a-42ba-a69c-02f254fd3955-etc-systemd\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.244362 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.244277 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/00b1549e-8f7a-42ba-a69c-02f254fd3955-host\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.244362 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.244332 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5lts4\" (UniqueName: \"kubernetes.io/projected/00b1549e-8f7a-42ba-a69c-02f254fd3955-kube-api-access-5lts4\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.245119 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.244439 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/00b1549e-8f7a-42ba-a69c-02f254fd3955-host\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.245119 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.244502 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/00b1549e-8f7a-42ba-a69c-02f254fd3955-run\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.246685 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.246659 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/00b1549e-8f7a-42ba-a69c-02f254fd3955-tmp\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.246824 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.246808 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/00b1549e-8f7a-42ba-a69c-02f254fd3955-etc-tuned\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.251916 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.251856 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lts4\" (UniqueName: \"kubernetes.io/projected/00b1549e-8f7a-42ba-a69c-02f254fd3955-kube-api-access-5lts4\") pod \"tuned-pzgs4\" (UID: \"00b1549e-8f7a-42ba-a69c-02f254fd3955\") " pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" Apr 22 18:46:56.324773 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.324732 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-whszt" Apr 22 18:46:56.331666 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.331644 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-b6nbf" Apr 22 18:46:56.340271 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.340245 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-9lgpr" Apr 22 18:46:56.345957 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.345926 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-d76kb" Apr 22 18:46:56.352603 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.352580 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ssc8m" Apr 22 18:46:56.359260 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.359241 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:46:56.365971 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.365949 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-j98vd" Apr 22 18:46:56.373672 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.373648 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vtnlw" Apr 22 18:46:56.379311 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.379290 2579 util.go:30] "No sandbox for pod can be found. 
Apr 22 18:46:56.646596 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.646525 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89068fbb-3d09-49a2-a369-e83cb48aaf0f-metrics-certs\") pod \"network-metrics-daemon-z69xm\" (UID: \"89068fbb-3d09-49a2-a369-e83cb48aaf0f\") " pod="openshift-multus/network-metrics-daemon-z69xm"
Apr 22 18:46:56.646596 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:56.646567 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vpt5q\" (UniqueName: \"kubernetes.io/projected/a85a3a29-77c8-4cb9-b029-9f13cfa79162-kube-api-access-vpt5q\") pod \"network-check-target-fsnhs\" (UID: \"a85a3a29-77c8-4cb9-b029-9f13cfa79162\") " pod="openshift-network-diagnostics/network-check-target-fsnhs"
Apr 22 18:46:56.646792 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:56.646723 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:56.646840 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:56.646806 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89068fbb-3d09-49a2-a369-e83cb48aaf0f-metrics-certs podName:89068fbb-3d09-49a2-a369-e83cb48aaf0f nodeName:}" failed. No retries permitted until 2026-04-22 18:46:57.646783193 +0000 UTC m=+4.095565519 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/89068fbb-3d09-49a2-a369-e83cb48aaf0f-metrics-certs") pod "network-metrics-daemon-z69xm" (UID: "89068fbb-3d09-49a2-a369-e83cb48aaf0f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:56.646904 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:56.646844 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:46:56.646904 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:56.646866 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:46:56.646904 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:56.646879 2579 projected.go:194] Error preparing data for projected volume kube-api-access-vpt5q for pod openshift-network-diagnostics/network-check-target-fsnhs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:56.647029 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:56.646961 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a85a3a29-77c8-4cb9-b029-9f13cfa79162-kube-api-access-vpt5q podName:a85a3a29-77c8-4cb9-b029-9f13cfa79162 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:57.646945589 +0000 UTC m=+4.095727912 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-vpt5q" (UniqueName: "kubernetes.io/projected/a85a3a29-77c8-4cb9-b029-9f13cfa79162-kube-api-access-vpt5q") pod "network-check-target-fsnhs" (UID: "a85a3a29-77c8-4cb9-b029-9f13cfa79162") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:56.793073 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:56.793038 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1935b284_bdc2_4d26_b763_4142392b2f07.slice/crio-a19f3dfdaff509cee33b413c286985199576d1e02d3dcc8e391c6abcdb70bb12 WatchSource:0}: Error finding container a19f3dfdaff509cee33b413c286985199576d1e02d3dcc8e391c6abcdb70bb12: Status 404 returned error can't find the container with id a19f3dfdaff509cee33b413c286985199576d1e02d3dcc8e391c6abcdb70bb12
Apr 22 18:46:56.798679 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:56.798655 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ac72934_9084_4fe8_82a7_ad61ce635cb5.slice/crio-16cdda5a665204a38b5ad8c5bd7c3d0259ce314b2bd5195e6440aa1a4c5080ae WatchSource:0}: Error finding container 16cdda5a665204a38b5ad8c5bd7c3d0259ce314b2bd5195e6440aa1a4c5080ae: Status 404 returned error can't find the container with id 16cdda5a665204a38b5ad8c5bd7c3d0259ce314b2bd5195e6440aa1a4c5080ae
Apr 22 18:46:56.798904 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:56.798880 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96372a97_6aae_4e84_bcb3_d4446e5315ed.slice/crio-d363ff982e39e30532f9af2024b1320a6afdaa45adb59777793a157a9d98bf21 WatchSource:0}: Error finding container d363ff982e39e30532f9af2024b1320a6afdaa45adb59777793a157a9d98bf21: Status 404 returned error can't find the container with id d363ff982e39e30532f9af2024b1320a6afdaa45adb59777793a157a9d98bf21
Apr 22 18:46:56.799614 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:56.799587 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode741af54_1487_45cb_b5ea_22577783221e.slice/crio-ed02f2b28fb04d52cfe59389c5cc221ca79e2a7b8ec865047fd57709fb099be4 WatchSource:0}: Error finding container ed02f2b28fb04d52cfe59389c5cc221ca79e2a7b8ec865047fd57709fb099be4: Status 404 returned error can't find the container with id ed02f2b28fb04d52cfe59389c5cc221ca79e2a7b8ec865047fd57709fb099be4
Apr 22 18:46:56.804831 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:56.804790 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00b1549e_8f7a_42ba_a69c_02f254fd3955.slice/crio-b9dbe764e6038cf2e1d8c677eeb7483c402c4aafbeb3ed1f5c8d66389f68f3ce WatchSource:0}: Error finding container b9dbe764e6038cf2e1d8c677eeb7483c402c4aafbeb3ed1f5c8d66389f68f3ce: Status 404 returned error can't find the container with id b9dbe764e6038cf2e1d8c677eeb7483c402c4aafbeb3ed1f5c8d66389f68f3ce
Apr 22 18:46:56.806949 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:56.806925 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84618b00_ccd9_4eef_b5aa_c79f222e29a9.slice/crio-56c9bf79df19676ae36599371b9f04ac528ecb3699abfce3c08b2f40fa6f3445 WatchSource:0}: Error finding container 56c9bf79df19676ae36599371b9f04ac528ecb3699abfce3c08b2f40fa6f3445: Status 404 returned error can't find the container with id 56c9bf79df19676ae36599371b9f04ac528ecb3699abfce3c08b2f40fa6f3445
Apr 22 18:46:56.807977 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:56.807934 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9c820a3_1c2e_4d37_bfc4_d18d772642d3.slice/crio-d9c928863212869307998ff91c0ba799a7b1469246c4728a7c66bccc2728e0f5 WatchSource:0}: Error finding container d9c928863212869307998ff91c0ba799a7b1469246c4728a7c66bccc2728e0f5: Status 404 returned error can't find the container with id d9c928863212869307998ff91c0ba799a7b1469246c4728a7c66bccc2728e0f5
Apr 22 18:46:56.809340 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:56.809286 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb919031c_5663_434c_ad1b_7f33fd17b752.slice/crio-eb8dd3107be5fa2017f1789065f2a2a2e778085e48808f8ec4805965b148d87c WatchSource:0}: Error finding container eb8dd3107be5fa2017f1789065f2a2a2e778085e48808f8ec4805965b148d87c: Status 404 returned error can't find the container with id eb8dd3107be5fa2017f1789065f2a2a2e778085e48808f8ec4805965b148d87c
Apr 22 18:46:56.811120 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:46:56.811092 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd751b5d6_3a64_4cdd_addf_8bc01e1ef230.slice/crio-a1e34d5f8024a69018794b6f53083bde59db9282315a2b80a4225c3a3f1ba9ed WatchSource:0}: Error finding container a1e34d5f8024a69018794b6f53083bde59db9282315a2b80a4225c3a3f1ba9ed: Status 404 returned error can't find the container with id a1e34d5f8024a69018794b6f53083bde59db9282315a2b80a4225c3a3f1ba9ed
Apr 22 18:46:57.058781 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:57.058575 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:41:55 +0000 UTC" deadline="2027-09-24 02:15:11.429261058 +0000 UTC"
Apr 22 18:46:57.058781 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:57.058778 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12463h28m14.370487654s"
Apr 22 18:46:57.159458 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:57.159424 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsnhs"
Apr 22 18:46:57.159644 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:57.159572 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fsnhs" podUID="a85a3a29-77c8-4cb9-b029-9f13cfa79162"
Apr 22 18:46:57.169641 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:57.169597 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-53.ec2.internal" event={"ID":"837597ad9723f3b9499e99b045f4e6a9","Type":"ContainerStarted","Data":"821ba2040fda7b1b7d67a29d7d89188473107f6fdf89363570f66cf344367fef"}
Apr 22 18:46:57.177045 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:57.177013 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-whszt" event={"ID":"d751b5d6-3a64-4cdd-addf-8bc01e1ef230","Type":"ContainerStarted","Data":"a1e34d5f8024a69018794b6f53083bde59db9282315a2b80a4225c3a3f1ba9ed"}
Apr 22 18:46:57.179810 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:57.179767 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b6nbf" event={"ID":"b919031c-5663-434c-ad1b-7f33fd17b752","Type":"ContainerStarted","Data":"eb8dd3107be5fa2017f1789065f2a2a2e778085e48808f8ec4805965b148d87c"}
Apr 22 18:46:57.181836 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:57.181795 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-9lgpr" event={"ID":"e9c820a3-1c2e-4d37-bfc4-d18d772642d3","Type":"ContainerStarted","Data":"d9c928863212869307998ff91c0ba799a7b1469246c4728a7c66bccc2728e0f5"}
Apr 22 18:46:57.183802 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:57.183776 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" event={"ID":"e741af54-1487-45cb-b5ea-22577783221e","Type":"ContainerStarted","Data":"ed02f2b28fb04d52cfe59389c5cc221ca79e2a7b8ec865047fd57709fb099be4"}
Apr 22 18:46:57.186041 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:57.186008 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-j98vd" event={"ID":"4ac72934-9084-4fe8-82a7-ad61ce635cb5","Type":"ContainerStarted","Data":"16cdda5a665204a38b5ad8c5bd7c3d0259ce314b2bd5195e6440aa1a4c5080ae"}
Apr 22 18:46:57.191563 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:57.191512 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ssc8m" event={"ID":"84618b00-ccd9-4eef-b5aa-c79f222e29a9","Type":"ContainerStarted","Data":"56c9bf79df19676ae36599371b9f04ac528ecb3699abfce3c08b2f40fa6f3445"}
Apr 22 18:46:57.195684 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:57.195629 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" event={"ID":"00b1549e-8f7a-42ba-a69c-02f254fd3955","Type":"ContainerStarted","Data":"b9dbe764e6038cf2e1d8c677eeb7483c402c4aafbeb3ed1f5c8d66389f68f3ce"}
Apr 22 18:46:57.200363 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:57.200334 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-d76kb" event={"ID":"96372a97-6aae-4e84-bcb3-d4446e5315ed","Type":"ContainerStarted","Data":"d363ff982e39e30532f9af2024b1320a6afdaa45adb59777793a157a9d98bf21"}
Apr 22 18:46:57.203403 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:57.203366 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vtnlw" event={"ID":"1935b284-bdc2-4d26-b763-4142392b2f07","Type":"ContainerStarted","Data":"a19f3dfdaff509cee33b413c286985199576d1e02d3dcc8e391c6abcdb70bb12"}
Apr 22 18:46:57.655517 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:57.655248 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vpt5q\" (UniqueName: \"kubernetes.io/projected/a85a3a29-77c8-4cb9-b029-9f13cfa79162-kube-api-access-vpt5q\") pod \"network-check-target-fsnhs\" (UID: \"a85a3a29-77c8-4cb9-b029-9f13cfa79162\") " pod="openshift-network-diagnostics/network-check-target-fsnhs"
Apr 22 18:46:57.655517 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:57.655337 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89068fbb-3d09-49a2-a369-e83cb48aaf0f-metrics-certs\") pod \"network-metrics-daemon-z69xm\" (UID: \"89068fbb-3d09-49a2-a369-e83cb48aaf0f\") " pod="openshift-multus/network-metrics-daemon-z69xm"
Apr 22 18:46:57.655517 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:57.655422 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:46:57.655517 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:57.655450 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:46:57.655517 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:57.655465 2579 projected.go:194] Error preparing data for projected volume kube-api-access-vpt5q for pod openshift-network-diagnostics/network-check-target-fsnhs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:57.655517 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:57.655477 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:57.655928 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:57.655542 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a85a3a29-77c8-4cb9-b029-9f13cfa79162-kube-api-access-vpt5q podName:a85a3a29-77c8-4cb9-b029-9f13cfa79162 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:59.655522046 +0000 UTC m=+6.104304371 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-vpt5q" (UniqueName: "kubernetes.io/projected/a85a3a29-77c8-4cb9-b029-9f13cfa79162-kube-api-access-vpt5q") pod "network-check-target-fsnhs" (UID: "a85a3a29-77c8-4cb9-b029-9f13cfa79162") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:57.655928 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:57.655577 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89068fbb-3d09-49a2-a369-e83cb48aaf0f-metrics-certs podName:89068fbb-3d09-49a2-a369-e83cb48aaf0f nodeName:}" failed. No retries permitted until 2026-04-22 18:46:59.655556217 +0000 UTC m=+6.104338528 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/89068fbb-3d09-49a2-a369-e83cb48aaf0f-metrics-certs") pod "network-metrics-daemon-z69xm" (UID: "89068fbb-3d09-49a2-a369-e83cb48aaf0f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:58.159776 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:58.159739 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z69xm"
Apr 22 18:46:58.160238 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:58.159890 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z69xm" podUID="89068fbb-3d09-49a2-a369-e83cb48aaf0f"
Apr 22 18:46:58.215311 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:58.215224 2579 generic.go:358] "Generic (PLEG): container finished" podID="890c59873ddf29c2d5fdc1745f24f9dc" containerID="7a554fd3d65ded23eb1f905e904f237881ce9fffba8905c2d6caec15443bfaea" exitCode=0
Apr 22 18:46:58.215520 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:58.215312 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-53.ec2.internal" event={"ID":"890c59873ddf29c2d5fdc1745f24f9dc","Type":"ContainerDied","Data":"7a554fd3d65ded23eb1f905e904f237881ce9fffba8905c2d6caec15443bfaea"}
Apr 22 18:46:58.227596 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:58.227542 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-53.ec2.internal" podStartSLOduration=3.227521828 podStartE2EDuration="3.227521828s" podCreationTimestamp="2026-04-22 18:46:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:46:57.181895402 +0000 UTC m=+3.630677734" watchObservedRunningTime="2026-04-22 18:46:58.227521828 +0000 UTC m=+4.676304162"
Apr 22 18:46:59.163507 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:59.159507 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsnhs"
Apr 22 18:46:59.163507 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:59.159864 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fsnhs" podUID="a85a3a29-77c8-4cb9-b029-9f13cfa79162"
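[Annotation] The recurring "no CNI configuration file in /etc/kubernetes/cni/net.d/" error persists until the network plugin (ovnkube-node here) writes its config; the runtime reports the network NotReady while that directory contains no network configuration file. A stdlib-only sketch of that kind of probe, as an illustration rather than CRI-O's actual implementation:

    package sketch

    import (
    	"os"
    	"path/filepath"
    )

    // hasCNIConfig reports whether dir (e.g. /etc/kubernetes/cni/net.d/)
    // contains at least one CNI network configuration file.
    func hasCNIConfig(dir string) bool {
    	entries, err := os.ReadDir(dir)
    	if err != nil {
    		return false // missing or unreadable dir counts as "no config"
    	}
    	for _, e := range entries {
    		if e.IsDir() {
    			continue
    		}
    		switch filepath.Ext(e.Name()) {
    		case ".conf", ".conflist", ".json":
    			return true
    		}
    	}
    	return false
    }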
pod="openshift-network-diagnostics/network-check-target-fsnhs" podUID="a85a3a29-77c8-4cb9-b029-9f13cfa79162" Apr 22 18:46:59.224092 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:59.224048 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-53.ec2.internal" event={"ID":"890c59873ddf29c2d5fdc1745f24f9dc","Type":"ContainerStarted","Data":"54c5bfe3c9da7a2329e0a550a1175cfd7e7c18f99217c142c7177a4c34cab874"} Apr 22 18:46:59.237640 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:59.237582 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-53.ec2.internal" podStartSLOduration=4.237562389 podStartE2EDuration="4.237562389s" podCreationTimestamp="2026-04-22 18:46:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:46:59.237146142 +0000 UTC m=+5.685928472" watchObservedRunningTime="2026-04-22 18:46:59.237562389 +0000 UTC m=+5.686344720" Apr 22 18:46:59.673823 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:59.673781 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vpt5q\" (UniqueName: \"kubernetes.io/projected/a85a3a29-77c8-4cb9-b029-9f13cfa79162-kube-api-access-vpt5q\") pod \"network-check-target-fsnhs\" (UID: \"a85a3a29-77c8-4cb9-b029-9f13cfa79162\") " pod="openshift-network-diagnostics/network-check-target-fsnhs" Apr 22 18:46:59.674017 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:46:59.673896 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89068fbb-3d09-49a2-a369-e83cb48aaf0f-metrics-certs\") pod \"network-metrics-daemon-z69xm\" (UID: \"89068fbb-3d09-49a2-a369-e83cb48aaf0f\") " pod="openshift-multus/network-metrics-daemon-z69xm" Apr 22 18:46:59.674079 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:59.674024 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:59.674146 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:59.674087 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89068fbb-3d09-49a2-a369-e83cb48aaf0f-metrics-certs podName:89068fbb-3d09-49a2-a369-e83cb48aaf0f nodeName:}" failed. No retries permitted until 2026-04-22 18:47:03.67406856 +0000 UTC m=+10.122850888 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/89068fbb-3d09-49a2-a369-e83cb48aaf0f-metrics-certs") pod "network-metrics-daemon-z69xm" (UID: "89068fbb-3d09-49a2-a369-e83cb48aaf0f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:59.674539 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:59.674520 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:46:59.674636 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:59.674543 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:46:59.674636 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:59.674557 2579 projected.go:194] Error preparing data for projected volume kube-api-access-vpt5q for pod openshift-network-diagnostics/network-check-target-fsnhs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:59.674636 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:46:59.674605 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a85a3a29-77c8-4cb9-b029-9f13cfa79162-kube-api-access-vpt5q podName:a85a3a29-77c8-4cb9-b029-9f13cfa79162 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:03.674589167 +0000 UTC m=+10.123371480 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-vpt5q" (UniqueName: "kubernetes.io/projected/a85a3a29-77c8-4cb9-b029-9f13cfa79162-kube-api-access-vpt5q") pod "network-check-target-fsnhs" (UID: "a85a3a29-77c8-4cb9-b029-9f13cfa79162") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:47:00.160288 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:00.159703 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z69xm" Apr 22 18:47:00.160288 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:00.159893 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z69xm" podUID="89068fbb-3d09-49a2-a369-e83cb48aaf0f" Apr 22 18:47:01.159263 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:01.159229 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsnhs" Apr 22 18:47:01.159722 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:01.159359 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fsnhs" podUID="a85a3a29-77c8-4cb9-b029-9f13cfa79162" Apr 22 18:47:02.159111 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:02.159078 2579 util.go:30] "No sandbox for pod can be found. 
Apr 22 18:47:02.159316 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:02.159219 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z69xm" podUID="89068fbb-3d09-49a2-a369-e83cb48aaf0f"
Apr 22 18:47:03.160376 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:03.159909 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsnhs"
Apr 22 18:47:03.160376 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:03.160037 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fsnhs" podUID="a85a3a29-77c8-4cb9-b029-9f13cfa79162"
Apr 22 18:47:03.708331 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:03.707902 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vpt5q\" (UniqueName: \"kubernetes.io/projected/a85a3a29-77c8-4cb9-b029-9f13cfa79162-kube-api-access-vpt5q\") pod \"network-check-target-fsnhs\" (UID: \"a85a3a29-77c8-4cb9-b029-9f13cfa79162\") " pod="openshift-network-diagnostics/network-check-target-fsnhs"
Apr 22 18:47:03.708331 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:03.707964 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89068fbb-3d09-49a2-a369-e83cb48aaf0f-metrics-certs\") pod \"network-metrics-daemon-z69xm\" (UID: \"89068fbb-3d09-49a2-a369-e83cb48aaf0f\") " pod="openshift-multus/network-metrics-daemon-z69xm"
Apr 22 18:47:03.708331 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:03.708078 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:47:03.708331 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:03.708137 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89068fbb-3d09-49a2-a369-e83cb48aaf0f-metrics-certs podName:89068fbb-3d09-49a2-a369-e83cb48aaf0f nodeName:}" failed. No retries permitted until 2026-04-22 18:47:11.70811849 +0000 UTC m=+18.156900821 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/89068fbb-3d09-49a2-a369-e83cb48aaf0f-metrics-certs") pod "network-metrics-daemon-z69xm" (UID: "89068fbb-3d09-49a2-a369-e83cb48aaf0f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:47:03.708331 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:03.708080 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:47:03.708331 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:03.708181 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:47:03.708331 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:03.708196 2579 projected.go:194] Error preparing data for projected volume kube-api-access-vpt5q for pod openshift-network-diagnostics/network-check-target-fsnhs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:47:03.708331 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:03.708237 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a85a3a29-77c8-4cb9-b029-9f13cfa79162-kube-api-access-vpt5q podName:a85a3a29-77c8-4cb9-b029-9f13cfa79162 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:11.70822454 +0000 UTC m=+18.157006863 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-vpt5q" (UniqueName: "kubernetes.io/projected/a85a3a29-77c8-4cb9-b029-9f13cfa79162-kube-api-access-vpt5q") pod "network-check-target-fsnhs" (UID: "a85a3a29-77c8-4cb9-b029-9f13cfa79162") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:47:04.161637 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:04.160954 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z69xm"
Apr 22 18:47:04.161637 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:04.161083 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z69xm" podUID="89068fbb-3d09-49a2-a369-e83cb48aaf0f"
Apr 22 18:47:05.160004 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:05.159965 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsnhs"
Apr 22 18:47:05.160176 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:05.160104 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fsnhs" podUID="a85a3a29-77c8-4cb9-b029-9f13cfa79162"
Apr 22 18:47:06.160009 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:06.159908 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z69xm"
Apr 22 18:47:06.160499 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:06.160050 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z69xm" podUID="89068fbb-3d09-49a2-a369-e83cb48aaf0f"
Apr 22 18:47:07.159495 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:07.159446 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsnhs"
Apr 22 18:47:07.159693 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:07.159574 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fsnhs" podUID="a85a3a29-77c8-4cb9-b029-9f13cfa79162"
Apr 22 18:47:08.159968 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:08.159926 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z69xm"
Apr 22 18:47:08.160430 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:08.160077 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z69xm" podUID="89068fbb-3d09-49a2-a369-e83cb48aaf0f"
Apr 22 18:47:09.159517 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:09.159470 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsnhs"
Apr 22 18:47:09.159672 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:09.159600 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fsnhs" podUID="a85a3a29-77c8-4cb9-b029-9f13cfa79162"
Apr 22 18:47:10.159655 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:10.159612 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z69xm"
Apr 22 18:47:10.160120 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:10.159740 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z69xm" podUID="89068fbb-3d09-49a2-a369-e83cb48aaf0f"
Apr 22 18:47:11.159432 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:11.159392 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsnhs"
Apr 22 18:47:11.159713 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:11.159556 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fsnhs" podUID="a85a3a29-77c8-4cb9-b029-9f13cfa79162"
Apr 22 18:47:11.768919 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:11.768874 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89068fbb-3d09-49a2-a369-e83cb48aaf0f-metrics-certs\") pod \"network-metrics-daemon-z69xm\" (UID: \"89068fbb-3d09-49a2-a369-e83cb48aaf0f\") " pod="openshift-multus/network-metrics-daemon-z69xm"
Apr 22 18:47:11.769116 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:11.768932 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vpt5q\" (UniqueName: \"kubernetes.io/projected/a85a3a29-77c8-4cb9-b029-9f13cfa79162-kube-api-access-vpt5q\") pod \"network-check-target-fsnhs\" (UID: \"a85a3a29-77c8-4cb9-b029-9f13cfa79162\") " pod="openshift-network-diagnostics/network-check-target-fsnhs"
Apr 22 18:47:11.769116 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:11.769037 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:47:11.769116 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:11.769106 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89068fbb-3d09-49a2-a369-e83cb48aaf0f-metrics-certs podName:89068fbb-3d09-49a2-a369-e83cb48aaf0f nodeName:}" failed. No retries permitted until 2026-04-22 18:47:27.769088941 +0000 UTC m=+34.217871249 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/89068fbb-3d09-49a2-a369-e83cb48aaf0f-metrics-certs") pod "network-metrics-daemon-z69xm" (UID: "89068fbb-3d09-49a2-a369-e83cb48aaf0f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:47:11.769286 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:11.769048 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:47:11.769286 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:11.769142 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:47:11.769286 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:11.769156 2579 projected.go:194] Error preparing data for projected volume kube-api-access-vpt5q for pod openshift-network-diagnostics/network-check-target-fsnhs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:47:11.769286 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:11.769198 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a85a3a29-77c8-4cb9-b029-9f13cfa79162-kube-api-access-vpt5q podName:a85a3a29-77c8-4cb9-b029-9f13cfa79162 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:27.769186257 +0000 UTC m=+34.217968570 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-vpt5q" (UniqueName: "kubernetes.io/projected/a85a3a29-77c8-4cb9-b029-9f13cfa79162-kube-api-access-vpt5q") pod "network-check-target-fsnhs" (UID: "a85a3a29-77c8-4cb9-b029-9f13cfa79162") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:47:12.159521 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:12.159414 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z69xm"
Apr 22 18:47:12.159676 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:12.159572 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z69xm" podUID="89068fbb-3d09-49a2-a369-e83cb48aaf0f"
Apr 22 18:47:13.159294 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:13.159235 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsnhs"
Apr 22 18:47:13.159767 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:13.159353 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fsnhs" podUID="a85a3a29-77c8-4cb9-b029-9f13cfa79162"
Apr 22 18:47:14.160327 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:14.160029 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z69xm"
Apr 22 18:47:14.160836 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:14.160403 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z69xm" podUID="89068fbb-3d09-49a2-a369-e83cb48aaf0f"
pod="openshift-multus/network-metrics-daemon-z69xm" podUID="89068fbb-3d09-49a2-a369-e83cb48aaf0f" Apr 22 18:47:14.254118 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:14.254025 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-j98vd" event={"ID":"4ac72934-9084-4fe8-82a7-ad61ce635cb5","Type":"ContainerStarted","Data":"ddc59cd4a22a7045ad1252ca5aa9d12557b9c1b21d7603a085eda4bbef33870f"} Apr 22 18:47:14.255706 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:14.255665 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ssc8m" event={"ID":"84618b00-ccd9-4eef-b5aa-c79f222e29a9","Type":"ContainerStarted","Data":"c88479ec7b84473541927981821476d023d43428f78592d7777a4f8bba5b5db0"} Apr 22 18:47:14.257130 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:14.257105 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" event={"ID":"00b1549e-8f7a-42ba-a69c-02f254fd3955","Type":"ContainerStarted","Data":"b44fc655c9b0fac196dcd65ac988737ac9b58291a74c56cdd0b7ef9a7f214f7f"} Apr 22 18:47:14.258468 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:14.258442 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-d76kb" event={"ID":"96372a97-6aae-4e84-bcb3-d4446e5315ed","Type":"ContainerStarted","Data":"35e902aa165f13f093ec10c8317174ecf4a4a77fcb8451d5f41da3c42bf68b71"} Apr 22 18:47:14.259873 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:14.259847 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vtnlw" event={"ID":"1935b284-bdc2-4d26-b763-4142392b2f07","Type":"ContainerStarted","Data":"09efe1c1b4ad1e8fa413dffeb76f82efb829144daf1414ac5965fe7efdbf5bf4"} Apr 22 18:47:14.261232 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:14.261203 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-whszt" event={"ID":"d751b5d6-3a64-4cdd-addf-8bc01e1ef230","Type":"ContainerStarted","Data":"a476f152556f5da7c342c782d506b551f6f6d45ce7bdbe7e184850615ab252c4"} Apr 22 18:47:14.262725 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:14.262697 2579 generic.go:358] "Generic (PLEG): container finished" podID="b919031c-5663-434c-ad1b-7f33fd17b752" containerID="b0af12b870da9d18b9c0c8d5ba74f2ebacfffec28d504dcfa3bb0b97453b29b7" exitCode=0 Apr 22 18:47:14.262824 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:14.262781 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b6nbf" event={"ID":"b919031c-5663-434c-ad1b-7f33fd17b752","Type":"ContainerDied","Data":"b0af12b870da9d18b9c0c8d5ba74f2ebacfffec28d504dcfa3bb0b97453b29b7"} Apr 22 18:47:14.265883 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:14.265818 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tcrqv_e741af54-1487-45cb-b5ea-22577783221e/ovn-acl-logging/0.log" Apr 22 18:47:14.266271 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:14.266184 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-j98vd" podStartSLOduration=3.565915094 podStartE2EDuration="20.26617128s" podCreationTimestamp="2026-04-22 18:46:54 +0000 UTC" firstStartedPulling="2026-04-22 18:46:56.800387096 +0000 UTC m=+3.249169421" lastFinishedPulling="2026-04-22 18:47:13.500643278 +0000 UTC m=+19.949425607" observedRunningTime="2026-04-22 18:47:14.266100669 +0000 UTC m=+20.714883002" 
watchObservedRunningTime="2026-04-22 18:47:14.26617128 +0000 UTC m=+20.714953610" Apr 22 18:47:14.266362 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:14.266334 2579 generic.go:358] "Generic (PLEG): container finished" podID="e741af54-1487-45cb-b5ea-22577783221e" containerID="37db104b4afb7999c89e7ced5b84250625390ed5498af676ee0f8e1d476607ac" exitCode=1 Apr 22 18:47:14.266431 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:14.266366 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" event={"ID":"e741af54-1487-45cb-b5ea-22577783221e","Type":"ContainerStarted","Data":"0c161aca7728b9c7c5b19340347b29635351224c05f62a315c6506a38ab66391"} Apr 22 18:47:14.266431 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:14.266385 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" event={"ID":"e741af54-1487-45cb-b5ea-22577783221e","Type":"ContainerStarted","Data":"f002a519b2446e8c2484d4654bcf2f0b61cc93f8a0baf379b3bfa969c7f8bf88"} Apr 22 18:47:14.266431 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:14.266399 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" event={"ID":"e741af54-1487-45cb-b5ea-22577783221e","Type":"ContainerDied","Data":"37db104b4afb7999c89e7ced5b84250625390ed5498af676ee0f8e1d476607ac"} Apr 22 18:47:14.266431 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:14.266413 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" event={"ID":"e741af54-1487-45cb-b5ea-22577783221e","Type":"ContainerStarted","Data":"5f589439f498afa0cd1df70288a8541444e55a64740bb6f54cb920a4f93e2c12"} Apr 22 18:47:14.278548 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:14.278472 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-d76kb" podStartSLOduration=3.579276397 podStartE2EDuration="20.278454419s" podCreationTimestamp="2026-04-22 18:46:54 +0000 UTC" firstStartedPulling="2026-04-22 18:46:56.800898472 +0000 UTC m=+3.249680781" lastFinishedPulling="2026-04-22 18:47:13.50007649 +0000 UTC m=+19.948858803" observedRunningTime="2026-04-22 18:47:14.278082837 +0000 UTC m=+20.726865167" watchObservedRunningTime="2026-04-22 18:47:14.278454419 +0000 UTC m=+20.727236751" Apr 22 18:47:14.294479 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:14.294424 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-ssc8m" podStartSLOduration=3.566762531 podStartE2EDuration="20.294405699s" podCreationTimestamp="2026-04-22 18:46:54 +0000 UTC" firstStartedPulling="2026-04-22 18:46:56.808912529 +0000 UTC m=+3.257694838" lastFinishedPulling="2026-04-22 18:47:13.536555686 +0000 UTC m=+19.985338006" observedRunningTime="2026-04-22 18:47:14.293789202 +0000 UTC m=+20.742571534" watchObservedRunningTime="2026-04-22 18:47:14.294405699 +0000 UTC m=+20.743188030" Apr 22 18:47:14.305425 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:14.305382 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-whszt" podStartSLOduration=3.618264392 podStartE2EDuration="20.30536956s" podCreationTimestamp="2026-04-22 18:46:54 +0000 UTC" firstStartedPulling="2026-04-22 18:46:56.813017692 +0000 UTC m=+3.261800001" lastFinishedPulling="2026-04-22 18:47:13.50012286 +0000 UTC m=+19.948905169" observedRunningTime="2026-04-22 18:47:14.305154462 +0000 UTC m=+20.753936791" watchObservedRunningTime="2026-04-22 
18:47:14.30536956 +0000 UTC m=+20.754151938" Apr 22 18:47:14.319495 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:14.319427 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-pzgs4" podStartSLOduration=3.624347952 podStartE2EDuration="20.319408335s" podCreationTimestamp="2026-04-22 18:46:54 +0000 UTC" firstStartedPulling="2026-04-22 18:46:56.80697876 +0000 UTC m=+3.255761082" lastFinishedPulling="2026-04-22 18:47:13.502039144 +0000 UTC m=+19.950821465" observedRunningTime="2026-04-22 18:47:14.319195727 +0000 UTC m=+20.767978056" watchObservedRunningTime="2026-04-22 18:47:14.319408335 +0000 UTC m=+20.768190666" Apr 22 18:47:14.816757 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:14.816733 2579 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 18:47:15.096551 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:15.096389 2579 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T18:47:14.816753609Z","UUID":"c1e68f25-08ea-4285-833c-f6d044b91f87","Handler":null,"Name":"","Endpoint":""} Apr 22 18:47:15.098356 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:15.098328 2579 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 18:47:15.098522 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:15.098367 2579 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 18:47:15.159714 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:15.159680 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsnhs" Apr 22 18:47:15.159902 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:15.159810 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-fsnhs" podUID="a85a3a29-77c8-4cb9-b029-9f13cfa79162" Apr 22 18:47:15.271844 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:15.271799 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vtnlw" event={"ID":"1935b284-bdc2-4d26-b763-4142392b2f07","Type":"ContainerStarted","Data":"c9357d6035f0c5fb58190a1d9c88986767dce87e90ae221bb8b1b03925d6e086"} Apr 22 18:47:15.274913 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:15.274860 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-9lgpr" event={"ID":"e9c820a3-1c2e-4d37-bfc4-d18d772642d3","Type":"ContainerStarted","Data":"8e9fcbb24fe60d31620b4fc1bf9b81b9bea97af605ae2653071b9152f0bf7ee8"} Apr 22 18:47:15.278423 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:15.278352 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tcrqv_e741af54-1487-45cb-b5ea-22577783221e/ovn-acl-logging/0.log" Apr 22 18:47:15.278888 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:15.278838 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" event={"ID":"e741af54-1487-45cb-b5ea-22577783221e","Type":"ContainerStarted","Data":"147ed082fd24455ebf04419e27f188368b8152250d881cedb3a60122bbdf7df2"} Apr 22 18:47:15.278999 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:15.278897 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" event={"ID":"e741af54-1487-45cb-b5ea-22577783221e","Type":"ContainerStarted","Data":"43c42e128ccdaeed4f2808b321236292f56717fdf6e360037a175d8bcdc63ae2"} Apr 22 18:47:15.287262 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:15.287212 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-9lgpr" podStartSLOduration=4.596231574 podStartE2EDuration="21.287197861s" podCreationTimestamp="2026-04-22 18:46:54 +0000 UTC" firstStartedPulling="2026-04-22 18:46:56.809583481 +0000 UTC m=+3.258365790" lastFinishedPulling="2026-04-22 18:47:13.500549756 +0000 UTC m=+19.949332077" observedRunningTime="2026-04-22 18:47:15.28677068 +0000 UTC m=+21.735553011" watchObservedRunningTime="2026-04-22 18:47:15.287197861 +0000 UTC m=+21.735980194" Apr 22 18:47:15.296660 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:15.296630 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-j98vd" Apr 22 18:47:15.297305 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:15.297282 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-j98vd" Apr 22 18:47:16.160217 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:16.159948 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z69xm" Apr 22 18:47:16.160475 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:16.160264 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z69xm" podUID="89068fbb-3d09-49a2-a369-e83cb48aaf0f" Apr 22 18:47:16.282833 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:16.282791 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vtnlw" event={"ID":"1935b284-bdc2-4d26-b763-4142392b2f07","Type":"ContainerStarted","Data":"4752606e4e150a993666e54a6b4c83a87f3a556c0def01bb21ce63c7fb72e60f"} Apr 22 18:47:17.159543 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:17.159506 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsnhs" Apr 22 18:47:17.159737 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:17.159625 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fsnhs" podUID="a85a3a29-77c8-4cb9-b029-9f13cfa79162" Apr 22 18:47:17.288416 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:17.288386 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tcrqv_e741af54-1487-45cb-b5ea-22577783221e/ovn-acl-logging/0.log" Apr 22 18:47:17.288996 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:17.288775 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" event={"ID":"e741af54-1487-45cb-b5ea-22577783221e","Type":"ContainerStarted","Data":"1c4fa2d8eb4c8a869be735cb079d5d8f2489f3e02737c6084341e6ffc429ed8d"} Apr 22 18:47:17.288996 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:17.288836 2579 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 18:47:18.159088 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:18.159054 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z69xm" Apr 22 18:47:18.159272 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:18.159193 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z69xm" podUID="89068fbb-3d09-49a2-a369-e83cb48aaf0f" Apr 22 18:47:19.160133 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:19.159948 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsnhs" Apr 22 18:47:19.160789 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:19.160202 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-fsnhs" podUID="a85a3a29-77c8-4cb9-b029-9f13cfa79162" Apr 22 18:47:19.293923 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:19.293886 2579 generic.go:358] "Generic (PLEG): container finished" podID="b919031c-5663-434c-ad1b-7f33fd17b752" containerID="58636812f5377ed6adf90013481a9fb2af7ed0f0554676901b1f8338525bb6c7" exitCode=0 Apr 22 18:47:19.294073 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:19.293959 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b6nbf" event={"ID":"b919031c-5663-434c-ad1b-7f33fd17b752","Type":"ContainerDied","Data":"58636812f5377ed6adf90013481a9fb2af7ed0f0554676901b1f8338525bb6c7"} Apr 22 18:47:19.296872 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:19.296852 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tcrqv_e741af54-1487-45cb-b5ea-22577783221e/ovn-acl-logging/0.log" Apr 22 18:47:19.297152 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:19.297129 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" event={"ID":"e741af54-1487-45cb-b5ea-22577783221e","Type":"ContainerStarted","Data":"b2bdc610546d45f45f35af4ae2107e8120c9438c252bec53e36f60b14d1a0a12"} Apr 22 18:47:19.297571 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:19.297442 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:47:19.297571 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:19.297465 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:47:19.297719 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:19.297612 2579 scope.go:117] "RemoveContainer" containerID="37db104b4afb7999c89e7ced5b84250625390ed5498af676ee0f8e1d476607ac" Apr 22 18:47:19.312747 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:19.312689 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:47:19.312827 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:19.312792 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:47:19.313021 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:19.312971 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vtnlw" podStartSLOduration=6.326676555 podStartE2EDuration="25.312960356s" podCreationTimestamp="2026-04-22 18:46:54 +0000 UTC" firstStartedPulling="2026-04-22 18:46:56.798920586 +0000 UTC m=+3.247702907" lastFinishedPulling="2026-04-22 18:47:15.785204395 +0000 UTC m=+22.233986708" observedRunningTime="2026-04-22 18:47:16.299428399 +0000 UTC m=+22.748210730" watchObservedRunningTime="2026-04-22 18:47:19.312960356 +0000 UTC m=+25.761742677" Apr 22 18:47:20.160031 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:20.159998 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z69xm" Apr 22 18:47:20.160188 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:20.160149 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z69xm" podUID="89068fbb-3d09-49a2-a369-e83cb48aaf0f" Apr 22 18:47:20.300626 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:20.300592 2579 generic.go:358] "Generic (PLEG): container finished" podID="b919031c-5663-434c-ad1b-7f33fd17b752" containerID="2da9ce0992b62b910dd3ba8257a33e9d0b87381b5a3ee4105132fced1543a049" exitCode=0 Apr 22 18:47:20.300779 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:20.300682 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b6nbf" event={"ID":"b919031c-5663-434c-ad1b-7f33fd17b752","Type":"ContainerDied","Data":"2da9ce0992b62b910dd3ba8257a33e9d0b87381b5a3ee4105132fced1543a049"} Apr 22 18:47:20.304339 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:20.304321 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tcrqv_e741af54-1487-45cb-b5ea-22577783221e/ovn-acl-logging/0.log" Apr 22 18:47:20.304747 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:20.304700 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" event={"ID":"e741af54-1487-45cb-b5ea-22577783221e","Type":"ContainerStarted","Data":"1e7eb7c4ce90e60c34ec4ba58364b0edf0b6234eff3a403bf8c2399b1e14dbe3"} Apr 22 18:47:20.304910 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:20.304891 2579 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 18:47:20.342825 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:20.342775 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" podStartSLOduration=9.397302526 podStartE2EDuration="26.342760899s" podCreationTimestamp="2026-04-22 18:46:54 +0000 UTC" firstStartedPulling="2026-04-22 18:46:56.802652774 +0000 UTC m=+3.251435084" lastFinishedPulling="2026-04-22 18:47:13.748111136 +0000 UTC m=+20.196893457" observedRunningTime="2026-04-22 18:47:20.34231877 +0000 UTC m=+26.791101103" watchObservedRunningTime="2026-04-22 18:47:20.342760899 +0000 UTC m=+26.791543228" Apr 22 18:47:20.625665 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:20.625416 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-z69xm"] Apr 22 18:47:20.625803 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:20.625777 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z69xm" Apr 22 18:47:20.625884 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:20.625867 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z69xm" podUID="89068fbb-3d09-49a2-a369-e83cb48aaf0f" Apr 22 18:47:20.628507 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:20.628454 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-fsnhs"] Apr 22 18:47:20.628597 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:20.628584 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsnhs" Apr 22 18:47:20.628673 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:20.628657 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fsnhs" podUID="a85a3a29-77c8-4cb9-b029-9f13cfa79162" Apr 22 18:47:21.308210 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:21.308115 2579 generic.go:358] "Generic (PLEG): container finished" podID="b919031c-5663-434c-ad1b-7f33fd17b752" containerID="74f5086c9489ddf55769efe2cecd548527de76431a77fa9fb749829fa99c94ed" exitCode=0 Apr 22 18:47:21.308210 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:21.308194 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b6nbf" event={"ID":"b919031c-5663-434c-ad1b-7f33fd17b752","Type":"ContainerDied","Data":"74f5086c9489ddf55769efe2cecd548527de76431a77fa9fb749829fa99c94ed"} Apr 22 18:47:21.308823 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:21.308426 2579 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 18:47:22.160010 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:22.159977 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z69xm" Apr 22 18:47:22.160217 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:22.159977 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsnhs" Apr 22 18:47:22.160217 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:22.160124 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z69xm" podUID="89068fbb-3d09-49a2-a369-e83cb48aaf0f" Apr 22 18:47:22.160217 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:22.160170 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fsnhs" podUID="a85a3a29-77c8-4cb9-b029-9f13cfa79162" Apr 22 18:47:23.811392 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:23.811353 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-j98vd" Apr 22 18:47:23.812056 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:23.811544 2579 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 18:47:23.812526 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:23.812497 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-j98vd" Apr 22 18:47:24.160984 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:24.160904 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z69xm" Apr 22 18:47:24.161150 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:24.161034 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z69xm" podUID="89068fbb-3d09-49a2-a369-e83cb48aaf0f" Apr 22 18:47:24.161150 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:24.161122 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsnhs" Apr 22 18:47:24.161231 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:24.161209 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fsnhs" podUID="a85a3a29-77c8-4cb9-b029-9f13cfa79162" Apr 22 18:47:25.767090 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:25.767051 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:47:25.767699 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:25.767322 2579 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 18:47:25.782555 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:25.782526 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tcrqv" Apr 22 18:47:26.159676 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:26.159586 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsnhs" Apr 22 18:47:26.159676 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:26.159616 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z69xm" Apr 22 18:47:26.159976 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:26.159716 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fsnhs" podUID="a85a3a29-77c8-4cb9-b029-9f13cfa79162" Apr 22 18:47:26.159976 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:26.159880 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z69xm" podUID="89068fbb-3d09-49a2-a369-e83cb48aaf0f" Apr 22 18:47:26.373143 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:26.373112 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-53.ec2.internal" event="NodeReady" Apr 22 18:47:26.373302 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:26.373269 2579 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 18:47:26.416226 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:26.416095 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-qd4bd"] Apr 22 18:47:26.446053 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:26.446022 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2qtxz"] Apr 22 18:47:26.446223 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:26.446202 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qd4bd" Apr 22 18:47:26.448425 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:26.448254 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zg5rc\"" Apr 22 18:47:26.448425 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:26.448292 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 18:47:26.448425 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:26.448321 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 18:47:26.461380 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:26.461356 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qd4bd"] Apr 22 18:47:26.461380 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:26.461383 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2qtxz"] Apr 22 18:47:26.461565 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:26.461505 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2qtxz" Apr 22 18:47:26.464389 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:26.464365 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rxjsb\"" Apr 22 18:47:26.464501 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:26.464410 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 18:47:26.464501 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:26.464374 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 18:47:26.464501 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:26.464442 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 18:47:26.580464 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:26.580430 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/493591d7-8b3a-41e8-8c8c-40d1e116007e-metrics-tls\") pod \"dns-default-qd4bd\" (UID: \"493591d7-8b3a-41e8-8c8c-40d1e116007e\") " pod="openshift-dns/dns-default-qd4bd" Apr 22 18:47:26.580657 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:26.580518 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/493591d7-8b3a-41e8-8c8c-40d1e116007e-config-volume\") pod \"dns-default-qd4bd\" (UID: \"493591d7-8b3a-41e8-8c8c-40d1e116007e\") " pod="openshift-dns/dns-default-qd4bd" Apr 22 18:47:26.580657 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:26.580546 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqjb6\" (UniqueName: \"kubernetes.io/projected/493591d7-8b3a-41e8-8c8c-40d1e116007e-kube-api-access-lqjb6\") pod \"dns-default-qd4bd\" (UID: \"493591d7-8b3a-41e8-8c8c-40d1e116007e\") " pod="openshift-dns/dns-default-qd4bd" Apr 22 18:47:26.580657 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:26.580567 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/493591d7-8b3a-41e8-8c8c-40d1e116007e-tmp-dir\") pod \"dns-default-qd4bd\" (UID: \"493591d7-8b3a-41e8-8c8c-40d1e116007e\") " pod="openshift-dns/dns-default-qd4bd" Apr 22 18:47:26.580657 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:26.580585 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb8nc\" (UniqueName: \"kubernetes.io/projected/3d74ab93-b985-4a88-882c-10c6a352ac61-kube-api-access-bb8nc\") pod \"ingress-canary-2qtxz\" (UID: \"3d74ab93-b985-4a88-882c-10c6a352ac61\") " pod="openshift-ingress-canary/ingress-canary-2qtxz" Apr 22 18:47:26.580824 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:26.580660 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d74ab93-b985-4a88-882c-10c6a352ac61-cert\") pod \"ingress-canary-2qtxz\" (UID: \"3d74ab93-b985-4a88-882c-10c6a352ac61\") " pod="openshift-ingress-canary/ingress-canary-2qtxz" Apr 22 18:47:26.681402 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:26.681317 2579 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/493591d7-8b3a-41e8-8c8c-40d1e116007e-metrics-tls\") pod \"dns-default-qd4bd\" (UID: \"493591d7-8b3a-41e8-8c8c-40d1e116007e\") " pod="openshift-dns/dns-default-qd4bd" Apr 22 18:47:26.681578 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:26.681409 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/493591d7-8b3a-41e8-8c8c-40d1e116007e-config-volume\") pod \"dns-default-qd4bd\" (UID: \"493591d7-8b3a-41e8-8c8c-40d1e116007e\") " pod="openshift-dns/dns-default-qd4bd" Apr 22 18:47:26.681578 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:26.681437 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lqjb6\" (UniqueName: \"kubernetes.io/projected/493591d7-8b3a-41e8-8c8c-40d1e116007e-kube-api-access-lqjb6\") pod \"dns-default-qd4bd\" (UID: \"493591d7-8b3a-41e8-8c8c-40d1e116007e\") " pod="openshift-dns/dns-default-qd4bd" Apr 22 18:47:26.681578 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:26.681461 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/493591d7-8b3a-41e8-8c8c-40d1e116007e-tmp-dir\") pod \"dns-default-qd4bd\" (UID: \"493591d7-8b3a-41e8-8c8c-40d1e116007e\") " pod="openshift-dns/dns-default-qd4bd" Apr 22 18:47:26.681578 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:26.681498 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bb8nc\" (UniqueName: \"kubernetes.io/projected/3d74ab93-b985-4a88-882c-10c6a352ac61-kube-api-access-bb8nc\") pod \"ingress-canary-2qtxz\" (UID: \"3d74ab93-b985-4a88-882c-10c6a352ac61\") " pod="openshift-ingress-canary/ingress-canary-2qtxz" Apr 22 18:47:26.681578 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:26.681511 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:26.681578 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:26.681535 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d74ab93-b985-4a88-882c-10c6a352ac61-cert\") pod \"ingress-canary-2qtxz\" (UID: \"3d74ab93-b985-4a88-882c-10c6a352ac61\") " pod="openshift-ingress-canary/ingress-canary-2qtxz" Apr 22 18:47:26.681870 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:26.681601 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/493591d7-8b3a-41e8-8c8c-40d1e116007e-metrics-tls podName:493591d7-8b3a-41e8-8c8c-40d1e116007e nodeName:}" failed. No retries permitted until 2026-04-22 18:47:27.18157541 +0000 UTC m=+33.630357720 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/493591d7-8b3a-41e8-8c8c-40d1e116007e-metrics-tls") pod "dns-default-qd4bd" (UID: "493591d7-8b3a-41e8-8c8c-40d1e116007e") : secret "dns-default-metrics-tls" not found Apr 22 18:47:26.681870 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:26.681636 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:26.681870 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:26.681686 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d74ab93-b985-4a88-882c-10c6a352ac61-cert podName:3d74ab93-b985-4a88-882c-10c6a352ac61 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:47:27.181669029 +0000 UTC m=+33.630451359 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d74ab93-b985-4a88-882c-10c6a352ac61-cert") pod "ingress-canary-2qtxz" (UID: "3d74ab93-b985-4a88-882c-10c6a352ac61") : secret "canary-serving-cert" not found Apr 22 18:47:26.682116 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:26.682087 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/493591d7-8b3a-41e8-8c8c-40d1e116007e-tmp-dir\") pod \"dns-default-qd4bd\" (UID: \"493591d7-8b3a-41e8-8c8c-40d1e116007e\") " pod="openshift-dns/dns-default-qd4bd" Apr 22 18:47:26.682185 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:26.682134 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/493591d7-8b3a-41e8-8c8c-40d1e116007e-config-volume\") pod \"dns-default-qd4bd\" (UID: \"493591d7-8b3a-41e8-8c8c-40d1e116007e\") " pod="openshift-dns/dns-default-qd4bd" Apr 22 18:47:26.692207 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:26.692039 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqjb6\" (UniqueName: \"kubernetes.io/projected/493591d7-8b3a-41e8-8c8c-40d1e116007e-kube-api-access-lqjb6\") pod \"dns-default-qd4bd\" (UID: \"493591d7-8b3a-41e8-8c8c-40d1e116007e\") " pod="openshift-dns/dns-default-qd4bd" Apr 22 18:47:26.692207 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:26.692199 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb8nc\" (UniqueName: \"kubernetes.io/projected/3d74ab93-b985-4a88-882c-10c6a352ac61-kube-api-access-bb8nc\") pod \"ingress-canary-2qtxz\" (UID: \"3d74ab93-b985-4a88-882c-10c6a352ac61\") " pod="openshift-ingress-canary/ingress-canary-2qtxz" Apr 22 18:47:27.185768 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:27.185732 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d74ab93-b985-4a88-882c-10c6a352ac61-cert\") pod \"ingress-canary-2qtxz\" (UID: \"3d74ab93-b985-4a88-882c-10c6a352ac61\") " pod="openshift-ingress-canary/ingress-canary-2qtxz" Apr 22 18:47:27.185768 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:27.185779 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/493591d7-8b3a-41e8-8c8c-40d1e116007e-metrics-tls\") pod \"dns-default-qd4bd\" (UID: \"493591d7-8b3a-41e8-8c8c-40d1e116007e\") " pod="openshift-dns/dns-default-qd4bd" Apr 22 18:47:27.186187 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:27.185895 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:27.186187 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:27.185951 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d74ab93-b985-4a88-882c-10c6a352ac61-cert podName:3d74ab93-b985-4a88-882c-10c6a352ac61 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:28.185937272 +0000 UTC m=+34.634719585 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d74ab93-b985-4a88-882c-10c6a352ac61-cert") pod "ingress-canary-2qtxz" (UID: "3d74ab93-b985-4a88-882c-10c6a352ac61") : secret "canary-serving-cert" not found Apr 22 18:47:27.186187 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:27.185901 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:27.186187 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:27.186030 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/493591d7-8b3a-41e8-8c8c-40d1e116007e-metrics-tls podName:493591d7-8b3a-41e8-8c8c-40d1e116007e nodeName:}" failed. No retries permitted until 2026-04-22 18:47:28.186018577 +0000 UTC m=+34.634800892 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/493591d7-8b3a-41e8-8c8c-40d1e116007e-metrics-tls") pod "dns-default-qd4bd" (UID: "493591d7-8b3a-41e8-8c8c-40d1e116007e") : secret "dns-default-metrics-tls" not found Apr 22 18:47:27.791107 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:27.791072 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89068fbb-3d09-49a2-a369-e83cb48aaf0f-metrics-certs\") pod \"network-metrics-daemon-z69xm\" (UID: \"89068fbb-3d09-49a2-a369-e83cb48aaf0f\") " pod="openshift-multus/network-metrics-daemon-z69xm" Apr 22 18:47:27.791303 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:27.791121 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vpt5q\" (UniqueName: \"kubernetes.io/projected/a85a3a29-77c8-4cb9-b029-9f13cfa79162-kube-api-access-vpt5q\") pod \"network-check-target-fsnhs\" (UID: \"a85a3a29-77c8-4cb9-b029-9f13cfa79162\") " pod="openshift-network-diagnostics/network-check-target-fsnhs" Apr 22 18:47:27.791303 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:27.791228 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:47:27.791303 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:27.791240 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:47:27.791303 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:27.791255 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:47:27.791303 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:27.791267 2579 projected.go:194] Error preparing data for projected volume kube-api-access-vpt5q for pod openshift-network-diagnostics/network-check-target-fsnhs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:47:27.791303 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:27.791304 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89068fbb-3d09-49a2-a369-e83cb48aaf0f-metrics-certs podName:89068fbb-3d09-49a2-a369-e83cb48aaf0f nodeName:}" failed. No retries permitted until 2026-04-22 18:47:59.791285099 +0000 UTC m=+66.240067423 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/89068fbb-3d09-49a2-a369-e83cb48aaf0f-metrics-certs") pod "network-metrics-daemon-z69xm" (UID: "89068fbb-3d09-49a2-a369-e83cb48aaf0f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:47:27.791513 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:27.791320 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a85a3a29-77c8-4cb9-b029-9f13cfa79162-kube-api-access-vpt5q podName:a85a3a29-77c8-4cb9-b029-9f13cfa79162 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:59.791313192 +0000 UTC m=+66.240095504 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-vpt5q" (UniqueName: "kubernetes.io/projected/a85a3a29-77c8-4cb9-b029-9f13cfa79162-kube-api-access-vpt5q") pod "network-check-target-fsnhs" (UID: "a85a3a29-77c8-4cb9-b029-9f13cfa79162") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:47:28.160125 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:28.160036 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z69xm" Apr 22 18:47:28.160377 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:28.160220 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsnhs" Apr 22 18:47:28.162182 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:28.162157 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 18:47:28.163145 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:28.163125 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 18:47:28.163257 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:28.163158 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-5ggfs\"" Apr 22 18:47:28.163257 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:28.163192 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-m94d4\"" Apr 22 18:47:28.163257 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:28.163195 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 18:47:28.194890 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:28.194856 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/493591d7-8b3a-41e8-8c8c-40d1e116007e-metrics-tls\") pod \"dns-default-qd4bd\" (UID: \"493591d7-8b3a-41e8-8c8c-40d1e116007e\") " pod="openshift-dns/dns-default-qd4bd" Apr 22 18:47:28.195332 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:28.194952 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d74ab93-b985-4a88-882c-10c6a352ac61-cert\") pod \"ingress-canary-2qtxz\" (UID: \"3d74ab93-b985-4a88-882c-10c6a352ac61\") " pod="openshift-ingress-canary/ingress-canary-2qtxz" Apr 22 18:47:28.195332 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:28.195003 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret 
"dns-default-metrics-tls" not found Apr 22 18:47:28.195332 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:28.195064 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:28.195332 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:28.195071 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/493591d7-8b3a-41e8-8c8c-40d1e116007e-metrics-tls podName:493591d7-8b3a-41e8-8c8c-40d1e116007e nodeName:}" failed. No retries permitted until 2026-04-22 18:47:30.195054924 +0000 UTC m=+36.643837233 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/493591d7-8b3a-41e8-8c8c-40d1e116007e-metrics-tls") pod "dns-default-qd4bd" (UID: "493591d7-8b3a-41e8-8c8c-40d1e116007e") : secret "dns-default-metrics-tls" not found Apr 22 18:47:28.195332 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:28.195115 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d74ab93-b985-4a88-882c-10c6a352ac61-cert podName:3d74ab93-b985-4a88-882c-10c6a352ac61 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:30.195097931 +0000 UTC m=+36.643880251 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d74ab93-b985-4a88-882c-10c6a352ac61-cert") pod "ingress-canary-2qtxz" (UID: "3d74ab93-b985-4a88-882c-10c6a352ac61") : secret "canary-serving-cert" not found Apr 22 18:47:28.324691 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:28.324661 2579 generic.go:358] "Generic (PLEG): container finished" podID="b919031c-5663-434c-ad1b-7f33fd17b752" containerID="14e0aa2c2d30b51cdc4ef2556540132111f753d04411f5bd6303d6e44efd5468" exitCode=0 Apr 22 18:47:28.324892 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:28.324721 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b6nbf" event={"ID":"b919031c-5663-434c-ad1b-7f33fd17b752","Type":"ContainerDied","Data":"14e0aa2c2d30b51cdc4ef2556540132111f753d04411f5bd6303d6e44efd5468"} Apr 22 18:47:29.330062 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:29.330027 2579 generic.go:358] "Generic (PLEG): container finished" podID="b919031c-5663-434c-ad1b-7f33fd17b752" containerID="b799d420756cc792a6a278a1da219f123b493782c5555f317f1f5d9f9ea1cc01" exitCode=0 Apr 22 18:47:29.330535 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:29.330093 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b6nbf" event={"ID":"b919031c-5663-434c-ad1b-7f33fd17b752","Type":"ContainerDied","Data":"b799d420756cc792a6a278a1da219f123b493782c5555f317f1f5d9f9ea1cc01"} Apr 22 18:47:30.210759 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:30.210730 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d74ab93-b985-4a88-882c-10c6a352ac61-cert\") pod \"ingress-canary-2qtxz\" (UID: \"3d74ab93-b985-4a88-882c-10c6a352ac61\") " pod="openshift-ingress-canary/ingress-canary-2qtxz" Apr 22 18:47:30.210891 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:30.210770 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/493591d7-8b3a-41e8-8c8c-40d1e116007e-metrics-tls\") pod \"dns-default-qd4bd\" (UID: \"493591d7-8b3a-41e8-8c8c-40d1e116007e\") " pod="openshift-dns/dns-default-qd4bd" Apr 22 18:47:30.210891 ip-10-0-136-53 
kubenswrapper[2579]: E0422 18:47:30.210885 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:30.210969 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:30.210889 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:30.210969 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:30.210937 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/493591d7-8b3a-41e8-8c8c-40d1e116007e-metrics-tls podName:493591d7-8b3a-41e8-8c8c-40d1e116007e nodeName:}" failed. No retries permitted until 2026-04-22 18:47:34.21092293 +0000 UTC m=+40.659705238 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/493591d7-8b3a-41e8-8c8c-40d1e116007e-metrics-tls") pod "dns-default-qd4bd" (UID: "493591d7-8b3a-41e8-8c8c-40d1e116007e") : secret "dns-default-metrics-tls" not found Apr 22 18:47:30.210969 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:30.210949 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d74ab93-b985-4a88-882c-10c6a352ac61-cert podName:3d74ab93-b985-4a88-882c-10c6a352ac61 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:34.210943681 +0000 UTC m=+40.659725989 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d74ab93-b985-4a88-882c-10c6a352ac61-cert") pod "ingress-canary-2qtxz" (UID: "3d74ab93-b985-4a88-882c-10c6a352ac61") : secret "canary-serving-cert" not found Apr 22 18:47:30.334589 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:30.334561 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b6nbf" event={"ID":"b919031c-5663-434c-ad1b-7f33fd17b752","Type":"ContainerStarted","Data":"dd1788b95d959c69008f46822c82cc8deaf208e84b843a1aadb7ef31046a3a29"} Apr 22 18:47:34.239676 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:34.239633 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/493591d7-8b3a-41e8-8c8c-40d1e116007e-metrics-tls\") pod \"dns-default-qd4bd\" (UID: \"493591d7-8b3a-41e8-8c8c-40d1e116007e\") " pod="openshift-dns/dns-default-qd4bd" Apr 22 18:47:34.240032 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:34.239773 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d74ab93-b985-4a88-882c-10c6a352ac61-cert\") pod \"ingress-canary-2qtxz\" (UID: \"3d74ab93-b985-4a88-882c-10c6a352ac61\") " pod="openshift-ingress-canary/ingress-canary-2qtxz" Apr 22 18:47:34.240032 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:34.239781 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:34.240032 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:34.239841 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/493591d7-8b3a-41e8-8c8c-40d1e116007e-metrics-tls podName:493591d7-8b3a-41e8-8c8c-40d1e116007e nodeName:}" failed. No retries permitted until 2026-04-22 18:47:42.23982623 +0000 UTC m=+48.688608538 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/493591d7-8b3a-41e8-8c8c-40d1e116007e-metrics-tls") pod "dns-default-qd4bd" (UID: "493591d7-8b3a-41e8-8c8c-40d1e116007e") : secret "dns-default-metrics-tls" not found Apr 22 18:47:34.240032 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:34.239848 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:34.240032 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:34.239886 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d74ab93-b985-4a88-882c-10c6a352ac61-cert podName:3d74ab93-b985-4a88-882c-10c6a352ac61 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:42.239873218 +0000 UTC m=+48.688655526 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d74ab93-b985-4a88-882c-10c6a352ac61-cert") pod "ingress-canary-2qtxz" (UID: "3d74ab93-b985-4a88-882c-10c6a352ac61") : secret "canary-serving-cert" not found Apr 22 18:47:42.296831 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:42.296788 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/493591d7-8b3a-41e8-8c8c-40d1e116007e-metrics-tls\") pod \"dns-default-qd4bd\" (UID: \"493591d7-8b3a-41e8-8c8c-40d1e116007e\") " pod="openshift-dns/dns-default-qd4bd" Apr 22 18:47:42.297342 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:42.296882 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d74ab93-b985-4a88-882c-10c6a352ac61-cert\") pod \"ingress-canary-2qtxz\" (UID: \"3d74ab93-b985-4a88-882c-10c6a352ac61\") " pod="openshift-ingress-canary/ingress-canary-2qtxz" Apr 22 18:47:42.297342 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:42.296947 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:42.297342 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:42.296986 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:42.297342 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:42.297015 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/493591d7-8b3a-41e8-8c8c-40d1e116007e-metrics-tls podName:493591d7-8b3a-41e8-8c8c-40d1e116007e nodeName:}" failed. No retries permitted until 2026-04-22 18:47:58.296995989 +0000 UTC m=+64.745778296 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/493591d7-8b3a-41e8-8c8c-40d1e116007e-metrics-tls") pod "dns-default-qd4bd" (UID: "493591d7-8b3a-41e8-8c8c-40d1e116007e") : secret "dns-default-metrics-tls" not found Apr 22 18:47:42.297342 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:42.297038 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d74ab93-b985-4a88-882c-10c6a352ac61-cert podName:3d74ab93-b985-4a88-882c-10c6a352ac61 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:58.297021349 +0000 UTC m=+64.745803657 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d74ab93-b985-4a88-882c-10c6a352ac61-cert") pod "ingress-canary-2qtxz" (UID: "3d74ab93-b985-4a88-882c-10c6a352ac61") : secret "canary-serving-cert" not found Apr 22 18:47:43.724292 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:43.724237 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-b6nbf" podStartSLOduration=19.224563445 podStartE2EDuration="49.724221576s" podCreationTimestamp="2026-04-22 18:46:54 +0000 UTC" firstStartedPulling="2026-04-22 18:46:56.81108928 +0000 UTC m=+3.259871588" lastFinishedPulling="2026-04-22 18:47:27.310747411 +0000 UTC m=+33.759529719" observedRunningTime="2026-04-22 18:47:30.364039652 +0000 UTC m=+36.812821987" watchObservedRunningTime="2026-04-22 18:47:43.724221576 +0000 UTC m=+50.173003905" Apr 22 18:47:43.724844 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:43.724826 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d9686bbc-gbr99"] Apr 22 18:47:43.770241 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:43.770207 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d9686bbc-gbr99"] Apr 22 18:47:43.770451 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:43.770361 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d9686bbc-gbr99" Apr 22 18:47:43.772436 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:43.772411 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 22 18:47:43.772558 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:43.772418 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 22 18:47:43.773278 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:43.773260 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 22 18:47:43.773334 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:43.773296 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-p454j\"" Apr 22 18:47:43.773379 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:43.773335 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 22 18:47:43.909079 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:43.909041 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4l5k\" (UniqueName: \"kubernetes.io/projected/11369bda-b1b6-40aa-9ec2-8516909a087f-kube-api-access-d4l5k\") pod \"managed-serviceaccount-addon-agent-d9686bbc-gbr99\" (UID: \"11369bda-b1b6-40aa-9ec2-8516909a087f\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d9686bbc-gbr99" Apr 22 18:47:43.909258 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:43.909106 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/11369bda-b1b6-40aa-9ec2-8516909a087f-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-d9686bbc-gbr99\" (UID: \"11369bda-b1b6-40aa-9ec2-8516909a087f\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d9686bbc-gbr99" Apr 22 18:47:44.009835 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:44.009789 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4l5k\" (UniqueName: \"kubernetes.io/projected/11369bda-b1b6-40aa-9ec2-8516909a087f-kube-api-access-d4l5k\") pod \"managed-serviceaccount-addon-agent-d9686bbc-gbr99\" (UID: \"11369bda-b1b6-40aa-9ec2-8516909a087f\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d9686bbc-gbr99" Apr 22 18:47:44.010012 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:44.009851 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/11369bda-b1b6-40aa-9ec2-8516909a087f-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-d9686bbc-gbr99\" (UID: \"11369bda-b1b6-40aa-9ec2-8516909a087f\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d9686bbc-gbr99" Apr 22 18:47:44.013204 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:44.013179 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/11369bda-b1b6-40aa-9ec2-8516909a087f-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-d9686bbc-gbr99\" (UID: \"11369bda-b1b6-40aa-9ec2-8516909a087f\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d9686bbc-gbr99" Apr 22 18:47:44.016543 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:44.016519 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4l5k\" (UniqueName: \"kubernetes.io/projected/11369bda-b1b6-40aa-9ec2-8516909a087f-kube-api-access-d4l5k\") pod \"managed-serviceaccount-addon-agent-d9686bbc-gbr99\" (UID: \"11369bda-b1b6-40aa-9ec2-8516909a087f\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d9686bbc-gbr99" Apr 22 18:47:44.095366 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:44.095339 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d9686bbc-gbr99" Apr 22 18:47:44.268789 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:44.268723 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d9686bbc-gbr99"] Apr 22 18:47:44.361148 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:44.361113 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d9686bbc-gbr99" event={"ID":"11369bda-b1b6-40aa-9ec2-8516909a087f","Type":"ContainerStarted","Data":"836a4a47ab3819841b8a1028dec3997f92bc6a65686ce2fe0e054548718c6571"} Apr 22 18:47:48.370531 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:48.370477 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d9686bbc-gbr99" event={"ID":"11369bda-b1b6-40aa-9ec2-8516909a087f","Type":"ContainerStarted","Data":"a4ac528763b18c2566e4bc481dbf98dd23d71912977b4d088d74f280b5e2ab14"} Apr 22 18:47:48.386114 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:48.386065 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d9686bbc-gbr99" podStartSLOduration=2.037446491 podStartE2EDuration="5.386049412s" podCreationTimestamp="2026-04-22 18:47:43 +0000 UTC" firstStartedPulling="2026-04-22 18:47:44.274392375 +0000 UTC m=+50.723174686" lastFinishedPulling="2026-04-22 18:47:47.622995286 +0000 UTC m=+54.071777607" observedRunningTime="2026-04-22 18:47:48.384802427 +0000 UTC m=+54.833584759" watchObservedRunningTime="2026-04-22 18:47:48.386049412 +0000 UTC m=+54.834831758" Apr 22 18:47:58.307980 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:58.307934 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d74ab93-b985-4a88-882c-10c6a352ac61-cert\") pod \"ingress-canary-2qtxz\" (UID: \"3d74ab93-b985-4a88-882c-10c6a352ac61\") " pod="openshift-ingress-canary/ingress-canary-2qtxz" Apr 22 18:47:58.308399 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:58.307991 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/493591d7-8b3a-41e8-8c8c-40d1e116007e-metrics-tls\") pod \"dns-default-qd4bd\" (UID: \"493591d7-8b3a-41e8-8c8c-40d1e116007e\") " pod="openshift-dns/dns-default-qd4bd" Apr 22 18:47:58.308399 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:58.308067 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:58.308399 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:58.308114 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:58.308399 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:58.308130 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d74ab93-b985-4a88-882c-10c6a352ac61-cert podName:3d74ab93-b985-4a88-882c-10c6a352ac61 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:30.308115072 +0000 UTC m=+96.756897380 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d74ab93-b985-4a88-882c-10c6a352ac61-cert") pod "ingress-canary-2qtxz" (UID: "3d74ab93-b985-4a88-882c-10c6a352ac61") : secret "canary-serving-cert" not found Apr 22 18:47:58.308399 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:58.308168 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/493591d7-8b3a-41e8-8c8c-40d1e116007e-metrics-tls podName:493591d7-8b3a-41e8-8c8c-40d1e116007e nodeName:}" failed. No retries permitted until 2026-04-22 18:48:30.308153719 +0000 UTC m=+96.756936041 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/493591d7-8b3a-41e8-8c8c-40d1e116007e-metrics-tls") pod "dns-default-qd4bd" (UID: "493591d7-8b3a-41e8-8c8c-40d1e116007e") : secret "dns-default-metrics-tls" not found Apr 22 18:47:59.816991 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:59.816944 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vpt5q\" (UniqueName: \"kubernetes.io/projected/a85a3a29-77c8-4cb9-b029-9f13cfa79162-kube-api-access-vpt5q\") pod \"network-check-target-fsnhs\" (UID: \"a85a3a29-77c8-4cb9-b029-9f13cfa79162\") " pod="openshift-network-diagnostics/network-check-target-fsnhs" Apr 22 18:47:59.817436 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:59.817028 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89068fbb-3d09-49a2-a369-e83cb48aaf0f-metrics-certs\") pod \"network-metrics-daemon-z69xm\" (UID: \"89068fbb-3d09-49a2-a369-e83cb48aaf0f\") " pod="openshift-multus/network-metrics-daemon-z69xm" Apr 22 18:47:59.819402 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:59.819379 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 18:47:59.819476 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:59.819417 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 18:47:59.828113 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:59.828094 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 18:47:59.828189 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:47:59.828151 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89068fbb-3d09-49a2-a369-e83cb48aaf0f-metrics-certs podName:89068fbb-3d09-49a2-a369-e83cb48aaf0f nodeName:}" failed. No retries permitted until 2026-04-22 18:49:03.828135694 +0000 UTC m=+130.276918001 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/89068fbb-3d09-49a2-a369-e83cb48aaf0f-metrics-certs") pod "network-metrics-daemon-z69xm" (UID: "89068fbb-3d09-49a2-a369-e83cb48aaf0f") : secret "metrics-daemon-secret" not found Apr 22 18:47:59.829714 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:59.829698 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 18:47:59.841348 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:59.841325 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpt5q\" (UniqueName: \"kubernetes.io/projected/a85a3a29-77c8-4cb9-b029-9f13cfa79162-kube-api-access-vpt5q\") pod \"network-check-target-fsnhs\" (UID: \"a85a3a29-77c8-4cb9-b029-9f13cfa79162\") " pod="openshift-network-diagnostics/network-check-target-fsnhs" Apr 22 18:47:59.976833 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:59.976801 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-5ggfs\"" Apr 22 18:47:59.985039 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:47:59.985015 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsnhs" Apr 22 18:48:00.097255 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:00.097137 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-fsnhs"] Apr 22 18:48:00.100994 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:48:00.100969 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda85a3a29_77c8_4cb9_b029_9f13cfa79162.slice/crio-16bced7a4ace97031b80ddb90d59a68bd8668bedca24cf925346a724b7c66154 WatchSource:0}: Error finding container 16bced7a4ace97031b80ddb90d59a68bd8668bedca24cf925346a724b7c66154: Status 404 returned error can't find the container with id 16bced7a4ace97031b80ddb90d59a68bd8668bedca24cf925346a724b7c66154 Apr 22 18:48:00.396976 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:00.396889 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fsnhs" event={"ID":"a85a3a29-77c8-4cb9-b029-9f13cfa79162","Type":"ContainerStarted","Data":"16bced7a4ace97031b80ddb90d59a68bd8668bedca24cf925346a724b7c66154"} Apr 22 18:48:03.407743 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:03.407704 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fsnhs" event={"ID":"a85a3a29-77c8-4cb9-b029-9f13cfa79162","Type":"ContainerStarted","Data":"25e3c723b55443eb5b8bb7c2c54163293ec7c195052087881293be61a17ca981"} Apr 22 18:48:03.408126 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:03.407925 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-fsnhs" Apr 22 18:48:03.421673 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:03.421624 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-fsnhs" podStartSLOduration=66.884819767 podStartE2EDuration="1m9.421608515s" podCreationTimestamp="2026-04-22 18:46:54 +0000 UTC" firstStartedPulling="2026-04-22 18:48:00.103404124 +0000 UTC m=+66.552186436" lastFinishedPulling="2026-04-22 18:48:02.640192874 +0000 UTC m=+69.088975184" 
observedRunningTime="2026-04-22 18:48:03.420967938 +0000 UTC m=+69.869750267" watchObservedRunningTime="2026-04-22 18:48:03.421608515 +0000 UTC m=+69.870390841" Apr 22 18:48:30.334544 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:30.334467 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d74ab93-b985-4a88-882c-10c6a352ac61-cert\") pod \"ingress-canary-2qtxz\" (UID: \"3d74ab93-b985-4a88-882c-10c6a352ac61\") " pod="openshift-ingress-canary/ingress-canary-2qtxz" Apr 22 18:48:30.334950 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:30.334569 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/493591d7-8b3a-41e8-8c8c-40d1e116007e-metrics-tls\") pod \"dns-default-qd4bd\" (UID: \"493591d7-8b3a-41e8-8c8c-40d1e116007e\") " pod="openshift-dns/dns-default-qd4bd" Apr 22 18:48:30.334950 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:48:30.334640 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:48:30.334950 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:48:30.334666 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:48:30.334950 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:48:30.334737 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d74ab93-b985-4a88-882c-10c6a352ac61-cert podName:3d74ab93-b985-4a88-882c-10c6a352ac61 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:34.334714449 +0000 UTC m=+160.783496757 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d74ab93-b985-4a88-882c-10c6a352ac61-cert") pod "ingress-canary-2qtxz" (UID: "3d74ab93-b985-4a88-882c-10c6a352ac61") : secret "canary-serving-cert" not found Apr 22 18:48:30.334950 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:48:30.334758 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/493591d7-8b3a-41e8-8c8c-40d1e116007e-metrics-tls podName:493591d7-8b3a-41e8-8c8c-40d1e116007e nodeName:}" failed. No retries permitted until 2026-04-22 18:49:34.334749173 +0000 UTC m=+160.783531484 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/493591d7-8b3a-41e8-8c8c-40d1e116007e-metrics-tls") pod "dns-default-qd4bd" (UID: "493591d7-8b3a-41e8-8c8c-40d1e116007e") : secret "dns-default-metrics-tls" not found Apr 22 18:48:34.412332 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:34.412292 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-fsnhs" Apr 22 18:48:52.255961 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.255928 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7cf88557b7-mcsdx"] Apr 22 18:48:52.258772 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.258754 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" Apr 22 18:48:52.260864 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.260842 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-c9m64\"" Apr 22 18:48:52.260987 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.260877 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 18:48:52.261048 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.261019 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 18:48:52.261329 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.261313 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 18:48:52.265794 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.265435 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 18:48:52.269067 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.269048 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7cf88557b7-mcsdx"] Apr 22 18:48:52.364321 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.364276 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x6vrp"] Apr 22 18:48:52.367084 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.367064 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x6vrp" Apr 22 18:48:52.370801 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.370776 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-7m566\"" Apr 22 18:48:52.370913 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.370871 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 22 18:48:52.370913 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.370880 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 22 18:48:52.371722 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.371707 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 22 18:48:52.375816 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.375795 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:48:52.382344 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.382320 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvhd6\" (UniqueName: \"kubernetes.io/projected/ceb18df5-258c-4b70-9799-2da231a29e8f-kube-api-access-kvhd6\") pod \"image-registry-7cf88557b7-mcsdx\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " 
pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" Apr 22 18:48:52.382451 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.382374 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ceb18df5-258c-4b70-9799-2da231a29e8f-registry-certificates\") pod \"image-registry-7cf88557b7-mcsdx\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" Apr 22 18:48:52.382451 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.382393 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ceb18df5-258c-4b70-9799-2da231a29e8f-registry-tls\") pod \"image-registry-7cf88557b7-mcsdx\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" Apr 22 18:48:52.382451 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.382430 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ceb18df5-258c-4b70-9799-2da231a29e8f-trusted-ca\") pod \"image-registry-7cf88557b7-mcsdx\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" Apr 22 18:48:52.382567 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.382519 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ceb18df5-258c-4b70-9799-2da231a29e8f-ca-trust-extracted\") pod \"image-registry-7cf88557b7-mcsdx\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" Apr 22 18:48:52.382567 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.382540 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ceb18df5-258c-4b70-9799-2da231a29e8f-installation-pull-secrets\") pod \"image-registry-7cf88557b7-mcsdx\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" Apr 22 18:48:52.382567 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.382562 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ceb18df5-258c-4b70-9799-2da231a29e8f-image-registry-private-configuration\") pod \"image-registry-7cf88557b7-mcsdx\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" Apr 22 18:48:52.382658 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.382579 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ceb18df5-258c-4b70-9799-2da231a29e8f-bound-sa-token\") pod \"image-registry-7cf88557b7-mcsdx\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" Apr 22 18:48:52.391242 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.391204 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-66ccb77bfb-zczmp"] Apr 22 18:48:52.393860 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.393846 2579 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-k4nph"] Apr 22 18:48:52.394004 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.393988 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-66ccb77bfb-zczmp" Apr 22 18:48:52.396378 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.396363 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-k4nph" Apr 22 18:48:52.398736 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.398718 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 22 18:48:52.399679 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.399625 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 22 18:48:52.399787 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.399710 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 22 18:48:52.400977 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.400958 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 22 18:48:52.401081 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.401053 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:48:52.401150 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.400968 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 22 18:48:52.401205 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.401164 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 22 18:48:52.401256 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.401208 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 22 18:48:52.401529 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.401511 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-w7g28\"" Apr 22 18:48:52.402532 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.402515 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x6vrp"] Apr 22 18:48:52.404406 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.404389 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-t86bh\"" Apr 22 18:48:52.409200 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.409114 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-66ccb77bfb-zczmp"] Apr 22 18:48:52.409906 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.409887 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-k4nph"] Apr 22 18:48:52.482974 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.482940 2579 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ceb18df5-258c-4b70-9799-2da231a29e8f-registry-certificates\") pod \"image-registry-7cf88557b7-mcsdx\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" Apr 22 18:48:52.482974 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.482980 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcvkt\" (UniqueName: \"kubernetes.io/projected/3ea60cc1-e044-48db-8411-8f2ee6f65e26-kube-api-access-dcvkt\") pod \"volume-data-source-validator-7c6cbb6c87-k4nph\" (UID: \"3ea60cc1-e044-48db-8411-8f2ee6f65e26\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-k4nph" Apr 22 18:48:52.483227 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.482999 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a74fd54d-3421-40fe-967e-0f3839d3664e-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-x6vrp\" (UID: \"a74fd54d-3421-40fe-967e-0f3839d3664e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x6vrp" Apr 22 18:48:52.483227 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.483040 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7033a011-280c-43ac-a654-1bfbeeac2a87-service-ca-bundle\") pod \"router-default-66ccb77bfb-zczmp\" (UID: \"7033a011-280c-43ac-a654-1bfbeeac2a87\") " pod="openshift-ingress/router-default-66ccb77bfb-zczmp" Apr 22 18:48:52.483227 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.483134 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ceb18df5-258c-4b70-9799-2da231a29e8f-registry-tls\") pod \"image-registry-7cf88557b7-mcsdx\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" Apr 22 18:48:52.483227 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.483153 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ceb18df5-258c-4b70-9799-2da231a29e8f-trusted-ca\") pod \"image-registry-7cf88557b7-mcsdx\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" Apr 22 18:48:52.483227 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.483201 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ceb18df5-258c-4b70-9799-2da231a29e8f-ca-trust-extracted\") pod \"image-registry-7cf88557b7-mcsdx\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" Apr 22 18:48:52.483478 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:48:52.483229 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:48:52.483478 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:48:52.483249 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7cf88557b7-mcsdx: secret "image-registry-tls" not found Apr 22 18:48:52.483478 ip-10-0-136-53 kubenswrapper[2579]: 
E0422 18:48:52.483300 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ceb18df5-258c-4b70-9799-2da231a29e8f-registry-tls podName:ceb18df5-258c-4b70-9799-2da231a29e8f nodeName:}" failed. No retries permitted until 2026-04-22 18:48:52.983283081 +0000 UTC m=+119.432065406 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ceb18df5-258c-4b70-9799-2da231a29e8f-registry-tls") pod "image-registry-7cf88557b7-mcsdx" (UID: "ceb18df5-258c-4b70-9799-2da231a29e8f") : secret "image-registry-tls" not found Apr 22 18:48:52.483478 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.483229 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ceb18df5-258c-4b70-9799-2da231a29e8f-installation-pull-secrets\") pod \"image-registry-7cf88557b7-mcsdx\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" Apr 22 18:48:52.483478 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.483414 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7033a011-280c-43ac-a654-1bfbeeac2a87-stats-auth\") pod \"router-default-66ccb77bfb-zczmp\" (UID: \"7033a011-280c-43ac-a654-1bfbeeac2a87\") " pod="openshift-ingress/router-default-66ccb77bfb-zczmp" Apr 22 18:48:52.483478 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.483447 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7033a011-280c-43ac-a654-1bfbeeac2a87-metrics-certs\") pod \"router-default-66ccb77bfb-zczmp\" (UID: \"7033a011-280c-43ac-a654-1bfbeeac2a87\") " pod="openshift-ingress/router-default-66ccb77bfb-zczmp" Apr 22 18:48:52.483813 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.483502 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ceb18df5-258c-4b70-9799-2da231a29e8f-image-registry-private-configuration\") pod \"image-registry-7cf88557b7-mcsdx\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" Apr 22 18:48:52.483813 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.483547 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ceb18df5-258c-4b70-9799-2da231a29e8f-bound-sa-token\") pod \"image-registry-7cf88557b7-mcsdx\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" Apr 22 18:48:52.483813 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.483583 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ceb18df5-258c-4b70-9799-2da231a29e8f-ca-trust-extracted\") pod \"image-registry-7cf88557b7-mcsdx\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" Apr 22 18:48:52.483813 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.483583 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7033a011-280c-43ac-a654-1bfbeeac2a87-default-certificate\") pod 
\"router-default-66ccb77bfb-zczmp\" (UID: \"7033a011-280c-43ac-a654-1bfbeeac2a87\") " pod="openshift-ingress/router-default-66ccb77bfb-zczmp" Apr 22 18:48:52.483813 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.483640 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8fpl\" (UniqueName: \"kubernetes.io/projected/7033a011-280c-43ac-a654-1bfbeeac2a87-kube-api-access-b8fpl\") pod \"router-default-66ccb77bfb-zczmp\" (UID: \"7033a011-280c-43ac-a654-1bfbeeac2a87\") " pod="openshift-ingress/router-default-66ccb77bfb-zczmp" Apr 22 18:48:52.483813 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.483687 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ceb18df5-258c-4b70-9799-2da231a29e8f-registry-certificates\") pod \"image-registry-7cf88557b7-mcsdx\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" Apr 22 18:48:52.483813 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.483690 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-554sm\" (UniqueName: \"kubernetes.io/projected/a74fd54d-3421-40fe-967e-0f3839d3664e-kube-api-access-554sm\") pod \"kube-storage-version-migrator-operator-6769c5d45-x6vrp\" (UID: \"a74fd54d-3421-40fe-967e-0f3839d3664e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x6vrp" Apr 22 18:48:52.483813 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.483753 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a74fd54d-3421-40fe-967e-0f3839d3664e-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-x6vrp\" (UID: \"a74fd54d-3421-40fe-967e-0f3839d3664e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x6vrp" Apr 22 18:48:52.483813 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.483800 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvhd6\" (UniqueName: \"kubernetes.io/projected/ceb18df5-258c-4b70-9799-2da231a29e8f-kube-api-access-kvhd6\") pod \"image-registry-7cf88557b7-mcsdx\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" Apr 22 18:48:52.487824 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.486199 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ceb18df5-258c-4b70-9799-2da231a29e8f-trusted-ca\") pod \"image-registry-7cf88557b7-mcsdx\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" Apr 22 18:48:52.487983 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.487963 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ceb18df5-258c-4b70-9799-2da231a29e8f-installation-pull-secrets\") pod \"image-registry-7cf88557b7-mcsdx\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" Apr 22 18:48:52.488581 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.488556 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ceb18df5-258c-4b70-9799-2da231a29e8f-image-registry-private-configuration\") pod \"image-registry-7cf88557b7-mcsdx\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" Apr 22 18:48:52.494100 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.494071 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvhd6\" (UniqueName: \"kubernetes.io/projected/ceb18df5-258c-4b70-9799-2da231a29e8f-kube-api-access-kvhd6\") pod \"image-registry-7cf88557b7-mcsdx\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" Apr 22 18:48:52.494534 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.494517 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ceb18df5-258c-4b70-9799-2da231a29e8f-bound-sa-token\") pod \"image-registry-7cf88557b7-mcsdx\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" Apr 22 18:48:52.585605 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.585507 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7033a011-280c-43ac-a654-1bfbeeac2a87-stats-auth\") pod \"router-default-66ccb77bfb-zczmp\" (UID: \"7033a011-280c-43ac-a654-1bfbeeac2a87\") " pod="openshift-ingress/router-default-66ccb77bfb-zczmp" Apr 22 18:48:52.585772 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.585638 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7033a011-280c-43ac-a654-1bfbeeac2a87-metrics-certs\") pod \"router-default-66ccb77bfb-zczmp\" (UID: \"7033a011-280c-43ac-a654-1bfbeeac2a87\") " pod="openshift-ingress/router-default-66ccb77bfb-zczmp" Apr 22 18:48:52.585772 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.585660 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7033a011-280c-43ac-a654-1bfbeeac2a87-default-certificate\") pod \"router-default-66ccb77bfb-zczmp\" (UID: \"7033a011-280c-43ac-a654-1bfbeeac2a87\") " pod="openshift-ingress/router-default-66ccb77bfb-zczmp" Apr 22 18:48:52.585772 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.585676 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8fpl\" (UniqueName: \"kubernetes.io/projected/7033a011-280c-43ac-a654-1bfbeeac2a87-kube-api-access-b8fpl\") pod \"router-default-66ccb77bfb-zczmp\" (UID: \"7033a011-280c-43ac-a654-1bfbeeac2a87\") " pod="openshift-ingress/router-default-66ccb77bfb-zczmp" Apr 22 18:48:52.585772 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.585706 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-554sm\" (UniqueName: \"kubernetes.io/projected/a74fd54d-3421-40fe-967e-0f3839d3664e-kube-api-access-554sm\") pod \"kube-storage-version-migrator-operator-6769c5d45-x6vrp\" (UID: \"a74fd54d-3421-40fe-967e-0f3839d3664e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x6vrp" Apr 22 18:48:52.585772 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.585725 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a74fd54d-3421-40fe-967e-0f3839d3664e-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-x6vrp\" (UID: \"a74fd54d-3421-40fe-967e-0f3839d3664e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x6vrp" Apr 22 18:48:52.586010 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:48:52.585767 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:48:52.586010 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:48:52.585857 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7033a011-280c-43ac-a654-1bfbeeac2a87-metrics-certs podName:7033a011-280c-43ac-a654-1bfbeeac2a87 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:53.085836872 +0000 UTC m=+119.534619200 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7033a011-280c-43ac-a654-1bfbeeac2a87-metrics-certs") pod "router-default-66ccb77bfb-zczmp" (UID: "7033a011-280c-43ac-a654-1bfbeeac2a87") : secret "router-metrics-certs-default" not found Apr 22 18:48:52.586010 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.585940 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dcvkt\" (UniqueName: \"kubernetes.io/projected/3ea60cc1-e044-48db-8411-8f2ee6f65e26-kube-api-access-dcvkt\") pod \"volume-data-source-validator-7c6cbb6c87-k4nph\" (UID: \"3ea60cc1-e044-48db-8411-8f2ee6f65e26\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-k4nph" Apr 22 18:48:52.586010 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.585974 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a74fd54d-3421-40fe-967e-0f3839d3664e-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-x6vrp\" (UID: \"a74fd54d-3421-40fe-967e-0f3839d3664e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x6vrp" Apr 22 18:48:52.586010 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.585998 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7033a011-280c-43ac-a654-1bfbeeac2a87-service-ca-bundle\") pod \"router-default-66ccb77bfb-zczmp\" (UID: \"7033a011-280c-43ac-a654-1bfbeeac2a87\") " pod="openshift-ingress/router-default-66ccb77bfb-zczmp" Apr 22 18:48:52.586259 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:48:52.586149 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7033a011-280c-43ac-a654-1bfbeeac2a87-service-ca-bundle podName:7033a011-280c-43ac-a654-1bfbeeac2a87 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:53.08613481 +0000 UTC m=+119.534917122 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/7033a011-280c-43ac-a654-1bfbeeac2a87-service-ca-bundle") pod "router-default-66ccb77bfb-zczmp" (UID: "7033a011-280c-43ac-a654-1bfbeeac2a87") : configmap references non-existent config key: service-ca.crt Apr 22 18:48:52.586578 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.586555 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a74fd54d-3421-40fe-967e-0f3839d3664e-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-x6vrp\" (UID: \"a74fd54d-3421-40fe-967e-0f3839d3664e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x6vrp" Apr 22 18:48:52.588000 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.587977 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a74fd54d-3421-40fe-967e-0f3839d3664e-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-x6vrp\" (UID: \"a74fd54d-3421-40fe-967e-0f3839d3664e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x6vrp" Apr 22 18:48:52.588180 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.588160 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7033a011-280c-43ac-a654-1bfbeeac2a87-default-certificate\") pod \"router-default-66ccb77bfb-zczmp\" (UID: \"7033a011-280c-43ac-a654-1bfbeeac2a87\") " pod="openshift-ingress/router-default-66ccb77bfb-zczmp" Apr 22 18:48:52.588265 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.588241 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7033a011-280c-43ac-a654-1bfbeeac2a87-stats-auth\") pod \"router-default-66ccb77bfb-zczmp\" (UID: \"7033a011-280c-43ac-a654-1bfbeeac2a87\") " pod="openshift-ingress/router-default-66ccb77bfb-zczmp" Apr 22 18:48:52.593811 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.593789 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcvkt\" (UniqueName: \"kubernetes.io/projected/3ea60cc1-e044-48db-8411-8f2ee6f65e26-kube-api-access-dcvkt\") pod \"volume-data-source-validator-7c6cbb6c87-k4nph\" (UID: \"3ea60cc1-e044-48db-8411-8f2ee6f65e26\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-k4nph" Apr 22 18:48:52.594525 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.594472 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8fpl\" (UniqueName: \"kubernetes.io/projected/7033a011-280c-43ac-a654-1bfbeeac2a87-kube-api-access-b8fpl\") pod \"router-default-66ccb77bfb-zczmp\" (UID: \"7033a011-280c-43ac-a654-1bfbeeac2a87\") " pod="openshift-ingress/router-default-66ccb77bfb-zczmp" Apr 22 18:48:52.594525 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.594479 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-554sm\" (UniqueName: \"kubernetes.io/projected/a74fd54d-3421-40fe-967e-0f3839d3664e-kube-api-access-554sm\") pod \"kube-storage-version-migrator-operator-6769c5d45-x6vrp\" (UID: \"a74fd54d-3421-40fe-967e-0f3839d3664e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x6vrp" Apr 22 18:48:52.675253 ip-10-0-136-53 kubenswrapper[2579]: I0422 
18:48:52.675219 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x6vrp" Apr 22 18:48:52.710355 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.710320 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-k4nph" Apr 22 18:48:52.791840 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.791811 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x6vrp"] Apr 22 18:48:52.796502 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:48:52.796456 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda74fd54d_3421_40fe_967e_0f3839d3664e.slice/crio-a5d1c0e796ec2c4fcc38bb8f8efd31b4eb4863cecb9c2e756fc363eef169464e WatchSource:0}: Error finding container a5d1c0e796ec2c4fcc38bb8f8efd31b4eb4863cecb9c2e756fc363eef169464e: Status 404 returned error can't find the container with id a5d1c0e796ec2c4fcc38bb8f8efd31b4eb4863cecb9c2e756fc363eef169464e Apr 22 18:48:52.831947 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.831910 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-k4nph"] Apr 22 18:48:52.834378 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:48:52.834349 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ea60cc1_e044_48db_8411_8f2ee6f65e26.slice/crio-85a447c16358578286c58ab6bb44b7cdafaee9e5901ea06d63fdaf80836f2da3 WatchSource:0}: Error finding container 85a447c16358578286c58ab6bb44b7cdafaee9e5901ea06d63fdaf80836f2da3: Status 404 returned error can't find the container with id 85a447c16358578286c58ab6bb44b7cdafaee9e5901ea06d63fdaf80836f2da3 Apr 22 18:48:52.989032 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:52.988995 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ceb18df5-258c-4b70-9799-2da231a29e8f-registry-tls\") pod \"image-registry-7cf88557b7-mcsdx\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" Apr 22 18:48:52.989214 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:48:52.989143 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:48:52.989214 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:48:52.989164 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7cf88557b7-mcsdx: secret "image-registry-tls" not found Apr 22 18:48:52.989290 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:48:52.989218 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ceb18df5-258c-4b70-9799-2da231a29e8f-registry-tls podName:ceb18df5-258c-4b70-9799-2da231a29e8f nodeName:}" failed. No retries permitted until 2026-04-22 18:48:53.989201736 +0000 UTC m=+120.437984044 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ceb18df5-258c-4b70-9799-2da231a29e8f-registry-tls") pod "image-registry-7cf88557b7-mcsdx" (UID: "ceb18df5-258c-4b70-9799-2da231a29e8f") : secret "image-registry-tls" not found Apr 22 18:48:53.089583 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:53.089474 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7033a011-280c-43ac-a654-1bfbeeac2a87-metrics-certs\") pod \"router-default-66ccb77bfb-zczmp\" (UID: \"7033a011-280c-43ac-a654-1bfbeeac2a87\") " pod="openshift-ingress/router-default-66ccb77bfb-zczmp" Apr 22 18:48:53.089732 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:53.089606 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7033a011-280c-43ac-a654-1bfbeeac2a87-service-ca-bundle\") pod \"router-default-66ccb77bfb-zczmp\" (UID: \"7033a011-280c-43ac-a654-1bfbeeac2a87\") " pod="openshift-ingress/router-default-66ccb77bfb-zczmp" Apr 22 18:48:53.089732 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:48:53.089637 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:48:53.089732 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:48:53.089711 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7033a011-280c-43ac-a654-1bfbeeac2a87-service-ca-bundle podName:7033a011-280c-43ac-a654-1bfbeeac2a87 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:54.089692212 +0000 UTC m=+120.538474534 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/7033a011-280c-43ac-a654-1bfbeeac2a87-service-ca-bundle") pod "router-default-66ccb77bfb-zczmp" (UID: "7033a011-280c-43ac-a654-1bfbeeac2a87") : configmap references non-existent config key: service-ca.crt Apr 22 18:48:53.089732 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:48:53.089729 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7033a011-280c-43ac-a654-1bfbeeac2a87-metrics-certs podName:7033a011-280c-43ac-a654-1bfbeeac2a87 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:54.089721328 +0000 UTC m=+120.538503636 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7033a011-280c-43ac-a654-1bfbeeac2a87-metrics-certs") pod "router-default-66ccb77bfb-zczmp" (UID: "7033a011-280c-43ac-a654-1bfbeeac2a87") : secret "router-metrics-certs-default" not found Apr 22 18:48:53.505541 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:53.505501 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x6vrp" event={"ID":"a74fd54d-3421-40fe-967e-0f3839d3664e","Type":"ContainerStarted","Data":"a5d1c0e796ec2c4fcc38bb8f8efd31b4eb4863cecb9c2e756fc363eef169464e"} Apr 22 18:48:53.506815 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:53.506786 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-k4nph" event={"ID":"3ea60cc1-e044-48db-8411-8f2ee6f65e26","Type":"ContainerStarted","Data":"85a447c16358578286c58ab6bb44b7cdafaee9e5901ea06d63fdaf80836f2da3"} Apr 22 18:48:53.996501 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:53.996453 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ceb18df5-258c-4b70-9799-2da231a29e8f-registry-tls\") pod \"image-registry-7cf88557b7-mcsdx\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" Apr 22 18:48:53.996708 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:48:53.996683 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:48:53.996760 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:48:53.996714 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7cf88557b7-mcsdx: secret "image-registry-tls" not found Apr 22 18:48:53.996804 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:48:53.996783 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ceb18df5-258c-4b70-9799-2da231a29e8f-registry-tls podName:ceb18df5-258c-4b70-9799-2da231a29e8f nodeName:}" failed. No retries permitted until 2026-04-22 18:48:55.996760528 +0000 UTC m=+122.445542842 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ceb18df5-258c-4b70-9799-2da231a29e8f-registry-tls") pod "image-registry-7cf88557b7-mcsdx" (UID: "ceb18df5-258c-4b70-9799-2da231a29e8f") : secret "image-registry-tls" not found Apr 22 18:48:54.096841 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:54.096803 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7033a011-280c-43ac-a654-1bfbeeac2a87-service-ca-bundle\") pod \"router-default-66ccb77bfb-zczmp\" (UID: \"7033a011-280c-43ac-a654-1bfbeeac2a87\") " pod="openshift-ingress/router-default-66ccb77bfb-zczmp" Apr 22 18:48:54.096998 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:54.096857 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7033a011-280c-43ac-a654-1bfbeeac2a87-metrics-certs\") pod \"router-default-66ccb77bfb-zczmp\" (UID: \"7033a011-280c-43ac-a654-1bfbeeac2a87\") " pod="openshift-ingress/router-default-66ccb77bfb-zczmp" Apr 22 18:48:54.096998 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:48:54.096963 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7033a011-280c-43ac-a654-1bfbeeac2a87-service-ca-bundle podName:7033a011-280c-43ac-a654-1bfbeeac2a87 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:56.096945055 +0000 UTC m=+122.545727367 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/7033a011-280c-43ac-a654-1bfbeeac2a87-service-ca-bundle") pod "router-default-66ccb77bfb-zczmp" (UID: "7033a011-280c-43ac-a654-1bfbeeac2a87") : configmap references non-existent config key: service-ca.crt Apr 22 18:48:54.096998 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:48:54.096961 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:48:54.096998 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:48:54.096995 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7033a011-280c-43ac-a654-1bfbeeac2a87-metrics-certs podName:7033a011-280c-43ac-a654-1bfbeeac2a87 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:56.096987312 +0000 UTC m=+122.545769620 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7033a011-280c-43ac-a654-1bfbeeac2a87-metrics-certs") pod "router-default-66ccb77bfb-zczmp" (UID: "7033a011-280c-43ac-a654-1bfbeeac2a87") : secret "router-metrics-certs-default" not found Apr 22 18:48:54.370436 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:54.370331 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-dr26w"] Apr 22 18:48:54.373548 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:54.373519 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-dr26w" Apr 22 18:48:54.375651 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:54.375626 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 22 18:48:54.375785 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:54.375724 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 22 18:48:54.375785 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:54.375755 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-6wtrx\"" Apr 22 18:48:54.381052 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:54.381005 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-dr26w"] Apr 22 18:48:54.500523 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:54.500463 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cc5b6d2f-45de-45a1-9bb3-d90f5940d33b-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-dr26w\" (UID: \"cc5b6d2f-45de-45a1-9bb3-d90f5940d33b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dr26w" Apr 22 18:48:54.500691 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:54.500572 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/cc5b6d2f-45de-45a1-9bb3-d90f5940d33b-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-dr26w\" (UID: \"cc5b6d2f-45de-45a1-9bb3-d90f5940d33b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dr26w" Apr 22 18:48:54.601940 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:54.601896 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cc5b6d2f-45de-45a1-9bb3-d90f5940d33b-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-dr26w\" (UID: \"cc5b6d2f-45de-45a1-9bb3-d90f5940d33b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dr26w" Apr 22 18:48:54.602440 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:54.601964 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/cc5b6d2f-45de-45a1-9bb3-d90f5940d33b-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-dr26w\" (UID: \"cc5b6d2f-45de-45a1-9bb3-d90f5940d33b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dr26w" Apr 22 18:48:54.602440 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:48:54.602062 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 18:48:54.602440 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:48:54.602143 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc5b6d2f-45de-45a1-9bb3-d90f5940d33b-networking-console-plugin-cert podName:cc5b6d2f-45de-45a1-9bb3-d90f5940d33b nodeName:}" failed. No retries permitted until 2026-04-22 18:48:55.102120817 +0000 UTC m=+121.550903135 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/cc5b6d2f-45de-45a1-9bb3-d90f5940d33b-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-dr26w" (UID: "cc5b6d2f-45de-45a1-9bb3-d90f5940d33b") : secret "networking-console-plugin-cert" not found Apr 22 18:48:54.602721 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:54.602701 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/cc5b6d2f-45de-45a1-9bb3-d90f5940d33b-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-dr26w\" (UID: \"cc5b6d2f-45de-45a1-9bb3-d90f5940d33b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dr26w" Apr 22 18:48:55.106594 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:55.106551 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cc5b6d2f-45de-45a1-9bb3-d90f5940d33b-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-dr26w\" (UID: \"cc5b6d2f-45de-45a1-9bb3-d90f5940d33b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dr26w" Apr 22 18:48:55.106774 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:48:55.106706 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 18:48:55.106774 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:48:55.106772 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc5b6d2f-45de-45a1-9bb3-d90f5940d33b-networking-console-plugin-cert podName:cc5b6d2f-45de-45a1-9bb3-d90f5940d33b nodeName:}" failed. No retries permitted until 2026-04-22 18:48:56.106756452 +0000 UTC m=+122.555538760 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/cc5b6d2f-45de-45a1-9bb3-d90f5940d33b-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-dr26w" (UID: "cc5b6d2f-45de-45a1-9bb3-d90f5940d33b") : secret "networking-console-plugin-cert" not found Apr 22 18:48:55.512019 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:55.511981 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x6vrp" event={"ID":"a74fd54d-3421-40fe-967e-0f3839d3664e","Type":"ContainerStarted","Data":"a155f3913cbc9dbd64c981675caf479b286fb98725c71112e23e968b4308d73b"} Apr 22 18:48:55.513221 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:55.513193 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-k4nph" event={"ID":"3ea60cc1-e044-48db-8411-8f2ee6f65e26","Type":"ContainerStarted","Data":"be25264e096a6e420484cfefcb9510dfdd65ae106d0fd1e84697710ca078f9d2"} Apr 22 18:48:55.525785 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:55.525744 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x6vrp" podStartSLOduration=1.354613889 podStartE2EDuration="3.525731557s" podCreationTimestamp="2026-04-22 18:48:52 +0000 UTC" firstStartedPulling="2026-04-22 18:48:52.798352674 +0000 UTC m=+119.247134983" lastFinishedPulling="2026-04-22 18:48:54.969470335 +0000 UTC m=+121.418252651" observedRunningTime="2026-04-22 18:48:55.525085763 +0000 UTC m=+121.973868094" watchObservedRunningTime="2026-04-22 18:48:55.525731557 +0000 UTC m=+121.974513913" Apr 22 18:48:55.536978 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:55.536926 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-k4nph" podStartSLOduration=1.445294351 podStartE2EDuration="3.536912631s" podCreationTimestamp="2026-04-22 18:48:52 +0000 UTC" firstStartedPulling="2026-04-22 18:48:52.83609113 +0000 UTC m=+119.284873438" lastFinishedPulling="2026-04-22 18:48:54.927709408 +0000 UTC m=+121.376491718" observedRunningTime="2026-04-22 18:48:55.536795324 +0000 UTC m=+121.985577655" watchObservedRunningTime="2026-04-22 18:48:55.536912631 +0000 UTC m=+121.985694962" Apr 22 18:48:56.011765 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:56.011723 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ceb18df5-258c-4b70-9799-2da231a29e8f-registry-tls\") pod \"image-registry-7cf88557b7-mcsdx\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" Apr 22 18:48:56.012161 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:48:56.011874 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:48:56.012161 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:48:56.011898 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7cf88557b7-mcsdx: secret "image-registry-tls" not found Apr 22 18:48:56.012161 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:48:56.011954 2579 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/ceb18df5-258c-4b70-9799-2da231a29e8f-registry-tls podName:ceb18df5-258c-4b70-9799-2da231a29e8f nodeName:}" failed. No retries permitted until 2026-04-22 18:49:00.011937118 +0000 UTC m=+126.460719426 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ceb18df5-258c-4b70-9799-2da231a29e8f-registry-tls") pod "image-registry-7cf88557b7-mcsdx" (UID: "ceb18df5-258c-4b70-9799-2da231a29e8f") : secret "image-registry-tls" not found Apr 22 18:48:56.112918 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:56.112867 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7033a011-280c-43ac-a654-1bfbeeac2a87-metrics-certs\") pod \"router-default-66ccb77bfb-zczmp\" (UID: \"7033a011-280c-43ac-a654-1bfbeeac2a87\") " pod="openshift-ingress/router-default-66ccb77bfb-zczmp" Apr 22 18:48:56.113118 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:56.112921 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cc5b6d2f-45de-45a1-9bb3-d90f5940d33b-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-dr26w\" (UID: \"cc5b6d2f-45de-45a1-9bb3-d90f5940d33b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dr26w" Apr 22 18:48:56.113118 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:48:56.113033 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:48:56.113118 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:48:56.113108 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7033a011-280c-43ac-a654-1bfbeeac2a87-metrics-certs podName:7033a011-280c-43ac-a654-1bfbeeac2a87 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:00.11308605 +0000 UTC m=+126.561868376 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7033a011-280c-43ac-a654-1bfbeeac2a87-metrics-certs") pod "router-default-66ccb77bfb-zczmp" (UID: "7033a011-280c-43ac-a654-1bfbeeac2a87") : secret "router-metrics-certs-default" not found Apr 22 18:48:56.113290 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:56.113036 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7033a011-280c-43ac-a654-1bfbeeac2a87-service-ca-bundle\") pod \"router-default-66ccb77bfb-zczmp\" (UID: \"7033a011-280c-43ac-a654-1bfbeeac2a87\") " pod="openshift-ingress/router-default-66ccb77bfb-zczmp" Apr 22 18:48:56.113290 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:48:56.113141 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7033a011-280c-43ac-a654-1bfbeeac2a87-service-ca-bundle podName:7033a011-280c-43ac-a654-1bfbeeac2a87 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:00.113125582 +0000 UTC m=+126.561907909 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/7033a011-280c-43ac-a654-1bfbeeac2a87-service-ca-bundle") pod "router-default-66ccb77bfb-zczmp" (UID: "7033a011-280c-43ac-a654-1bfbeeac2a87") : configmap references non-existent config key: service-ca.crt Apr 22 18:48:56.113290 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:48:56.113269 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 18:48:56.113422 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:48:56.113359 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc5b6d2f-45de-45a1-9bb3-d90f5940d33b-networking-console-plugin-cert podName:cc5b6d2f-45de-45a1-9bb3-d90f5940d33b nodeName:}" failed. No retries permitted until 2026-04-22 18:48:58.113329915 +0000 UTC m=+124.562112241 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/cc5b6d2f-45de-45a1-9bb3-d90f5940d33b-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-dr26w" (UID: "cc5b6d2f-45de-45a1-9bb3-d90f5940d33b") : secret "networking-console-plugin-cert" not found Apr 22 18:48:58.128774 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:58.128734 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cc5b6d2f-45de-45a1-9bb3-d90f5940d33b-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-dr26w\" (UID: \"cc5b6d2f-45de-45a1-9bb3-d90f5940d33b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dr26w" Apr 22 18:48:58.129171 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:48:58.128887 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 18:48:58.129171 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:48:58.128955 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc5b6d2f-45de-45a1-9bb3-d90f5940d33b-networking-console-plugin-cert podName:cc5b6d2f-45de-45a1-9bb3-d90f5940d33b nodeName:}" failed. No retries permitted until 2026-04-22 18:49:02.128940469 +0000 UTC m=+128.577722780 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/cc5b6d2f-45de-45a1-9bb3-d90f5940d33b-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-dr26w" (UID: "cc5b6d2f-45de-45a1-9bb3-d90f5940d33b") : secret "networking-console-plugin-cert" not found Apr 22 18:48:59.308716 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:48:59.308689 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-d76kb_96372a97-6aae-4e84-bcb3-d4446e5315ed/dns-node-resolver/0.log" Apr 22 18:49:00.046118 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:00.046081 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ceb18df5-258c-4b70-9799-2da231a29e8f-registry-tls\") pod \"image-registry-7cf88557b7-mcsdx\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" Apr 22 18:49:00.046301 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:49:00.046189 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:49:00.046301 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:49:00.046202 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7cf88557b7-mcsdx: secret "image-registry-tls" not found Apr 22 18:49:00.046301 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:49:00.046249 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ceb18df5-258c-4b70-9799-2da231a29e8f-registry-tls podName:ceb18df5-258c-4b70-9799-2da231a29e8f nodeName:}" failed. No retries permitted until 2026-04-22 18:49:08.046234404 +0000 UTC m=+134.495016712 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ceb18df5-258c-4b70-9799-2da231a29e8f-registry-tls") pod "image-registry-7cf88557b7-mcsdx" (UID: "ceb18df5-258c-4b70-9799-2da231a29e8f") : secret "image-registry-tls" not found Apr 22 18:49:00.146779 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:00.146736 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7033a011-280c-43ac-a654-1bfbeeac2a87-service-ca-bundle\") pod \"router-default-66ccb77bfb-zczmp\" (UID: \"7033a011-280c-43ac-a654-1bfbeeac2a87\") " pod="openshift-ingress/router-default-66ccb77bfb-zczmp" Apr 22 18:49:00.146932 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:00.146813 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7033a011-280c-43ac-a654-1bfbeeac2a87-metrics-certs\") pod \"router-default-66ccb77bfb-zczmp\" (UID: \"7033a011-280c-43ac-a654-1bfbeeac2a87\") " pod="openshift-ingress/router-default-66ccb77bfb-zczmp" Apr 22 18:49:00.146932 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:49:00.146923 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7033a011-280c-43ac-a654-1bfbeeac2a87-service-ca-bundle podName:7033a011-280c-43ac-a654-1bfbeeac2a87 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:08.146901859 +0000 UTC m=+134.595684220 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/7033a011-280c-43ac-a654-1bfbeeac2a87-service-ca-bundle") pod "router-default-66ccb77bfb-zczmp" (UID: "7033a011-280c-43ac-a654-1bfbeeac2a87") : configmap references non-existent config key: service-ca.crt Apr 22 18:49:00.147009 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:49:00.146946 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:49:00.147009 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:49:00.146995 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7033a011-280c-43ac-a654-1bfbeeac2a87-metrics-certs podName:7033a011-280c-43ac-a654-1bfbeeac2a87 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:08.146982904 +0000 UTC m=+134.595765216 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7033a011-280c-43ac-a654-1bfbeeac2a87-metrics-certs") pod "router-default-66ccb77bfb-zczmp" (UID: "7033a011-280c-43ac-a654-1bfbeeac2a87") : secret "router-metrics-certs-default" not found Apr 22 18:49:00.309356 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:00.309281 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-whszt_d751b5d6-3a64-4cdd-addf-8bc01e1ef230/node-ca/0.log" Apr 22 18:49:01.910748 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:01.910713 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-x6vrp_a74fd54d-3421-40fe-967e-0f3839d3664e/kube-storage-version-migrator-operator/0.log" Apr 22 18:49:02.160731 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:02.160696 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cc5b6d2f-45de-45a1-9bb3-d90f5940d33b-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-dr26w\" (UID: \"cc5b6d2f-45de-45a1-9bb3-d90f5940d33b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dr26w" Apr 22 18:49:02.160932 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:49:02.160822 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 18:49:02.160932 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:49:02.160878 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc5b6d2f-45de-45a1-9bb3-d90f5940d33b-networking-console-plugin-cert podName:cc5b6d2f-45de-45a1-9bb3-d90f5940d33b nodeName:}" failed. No retries permitted until 2026-04-22 18:49:10.160862975 +0000 UTC m=+136.609645282 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/cc5b6d2f-45de-45a1-9bb3-d90f5940d33b-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-dr26w" (UID: "cc5b6d2f-45de-45a1-9bb3-d90f5940d33b") : secret "networking-console-plugin-cert" not found Apr 22 18:49:03.872986 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:03.872941 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89068fbb-3d09-49a2-a369-e83cb48aaf0f-metrics-certs\") pod \"network-metrics-daemon-z69xm\" (UID: \"89068fbb-3d09-49a2-a369-e83cb48aaf0f\") " pod="openshift-multus/network-metrics-daemon-z69xm" Apr 22 18:49:03.873386 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:49:03.873097 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 18:49:03.873386 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:49:03.873163 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89068fbb-3d09-49a2-a369-e83cb48aaf0f-metrics-certs podName:89068fbb-3d09-49a2-a369-e83cb48aaf0f nodeName:}" failed. No retries permitted until 2026-04-22 18:51:05.873147317 +0000 UTC m=+252.321929644 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/89068fbb-3d09-49a2-a369-e83cb48aaf0f-metrics-certs") pod "network-metrics-daemon-z69xm" (UID: "89068fbb-3d09-49a2-a369-e83cb48aaf0f") : secret "metrics-daemon-secret" not found Apr 22 18:49:08.106224 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:08.106182 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ceb18df5-258c-4b70-9799-2da231a29e8f-registry-tls\") pod \"image-registry-7cf88557b7-mcsdx\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" Apr 22 18:49:08.108998 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:08.108975 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ceb18df5-258c-4b70-9799-2da231a29e8f-registry-tls\") pod \"image-registry-7cf88557b7-mcsdx\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" Apr 22 18:49:08.170116 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:08.170090 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-c9m64\"" Apr 22 18:49:08.178804 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:08.178782 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" Apr 22 18:49:08.206855 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:08.206822 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7033a011-280c-43ac-a654-1bfbeeac2a87-service-ca-bundle\") pod \"router-default-66ccb77bfb-zczmp\" (UID: \"7033a011-280c-43ac-a654-1bfbeeac2a87\") " pod="openshift-ingress/router-default-66ccb77bfb-zczmp" Apr 22 18:49:08.206996 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:08.206868 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7033a011-280c-43ac-a654-1bfbeeac2a87-metrics-certs\") pod \"router-default-66ccb77bfb-zczmp\" (UID: \"7033a011-280c-43ac-a654-1bfbeeac2a87\") " pod="openshift-ingress/router-default-66ccb77bfb-zczmp" Apr 22 18:49:08.207466 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:08.207448 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7033a011-280c-43ac-a654-1bfbeeac2a87-service-ca-bundle\") pod \"router-default-66ccb77bfb-zczmp\" (UID: \"7033a011-280c-43ac-a654-1bfbeeac2a87\") " pod="openshift-ingress/router-default-66ccb77bfb-zczmp" Apr 22 18:49:08.209168 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:08.209140 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7033a011-280c-43ac-a654-1bfbeeac2a87-metrics-certs\") pod \"router-default-66ccb77bfb-zczmp\" (UID: \"7033a011-280c-43ac-a654-1bfbeeac2a87\") " pod="openshift-ingress/router-default-66ccb77bfb-zczmp" Apr 22 18:49:08.295711 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:08.295680 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7cf88557b7-mcsdx"] Apr 22 18:49:08.299096 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:49:08.299070 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podceb18df5_258c_4b70_9799_2da231a29e8f.slice/crio-f4a8323910d04baef32f444e7125df5b552238cc61e3a7aa8ef619e05a2b486b WatchSource:0}: Error finding container f4a8323910d04baef32f444e7125df5b552238cc61e3a7aa8ef619e05a2b486b: Status 404 returned error can't find the container with id f4a8323910d04baef32f444e7125df5b552238cc61e3a7aa8ef619e05a2b486b Apr 22 18:49:08.306401 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:08.306383 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-w7g28\"" Apr 22 18:49:08.315029 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:08.315010 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-66ccb77bfb-zczmp" Apr 22 18:49:08.435324 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:08.435292 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-66ccb77bfb-zczmp"] Apr 22 18:49:08.438357 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:49:08.438331 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7033a011_280c_43ac_a654_1bfbeeac2a87.slice/crio-aed1bd15997ac5c1bfea1ba8dff7ae02e3d6cebfddd6a7f58ff91f02c172b871 WatchSource:0}: Error finding container aed1bd15997ac5c1bfea1ba8dff7ae02e3d6cebfddd6a7f58ff91f02c172b871: Status 404 returned error can't find the container with id aed1bd15997ac5c1bfea1ba8dff7ae02e3d6cebfddd6a7f58ff91f02c172b871 Apr 22 18:49:08.542172 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:08.542140 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" event={"ID":"ceb18df5-258c-4b70-9799-2da231a29e8f","Type":"ContainerStarted","Data":"0284dbfa08ce0a8cfcfe69f2fb18e784b714ed18effbd0763786fb5f2d7088c2"} Apr 22 18:49:08.542172 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:08.542177 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" event={"ID":"ceb18df5-258c-4b70-9799-2da231a29e8f","Type":"ContainerStarted","Data":"f4a8323910d04baef32f444e7125df5b552238cc61e3a7aa8ef619e05a2b486b"} Apr 22 18:49:08.542440 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:08.542308 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" Apr 22 18:49:08.543659 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:08.543630 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-66ccb77bfb-zczmp" event={"ID":"7033a011-280c-43ac-a654-1bfbeeac2a87","Type":"ContainerStarted","Data":"24641081368cf48768616f909ff6a61a5832115b34d244d86925ff8688d3de50"} Apr 22 18:49:08.543783 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:08.543663 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-66ccb77bfb-zczmp" event={"ID":"7033a011-280c-43ac-a654-1bfbeeac2a87","Type":"ContainerStarted","Data":"aed1bd15997ac5c1bfea1ba8dff7ae02e3d6cebfddd6a7f58ff91f02c172b871"} Apr 22 18:49:08.564287 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:08.564231 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" podStartSLOduration=16.564214639 podStartE2EDuration="16.564214639s" podCreationTimestamp="2026-04-22 18:48:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:49:08.563398053 +0000 UTC m=+135.012180386" watchObservedRunningTime="2026-04-22 18:49:08.564214639 +0000 UTC m=+135.012996969" Apr 22 18:49:08.580349 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:08.580287 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-66ccb77bfb-zczmp" podStartSLOduration=16.580268697 podStartE2EDuration="16.580268697s" podCreationTimestamp="2026-04-22 18:48:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:49:08.578869493 +0000 UTC 
m=+135.027651824" watchObservedRunningTime="2026-04-22 18:49:08.580268697 +0000 UTC m=+135.029051028" Apr 22 18:49:09.315559 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:09.315527 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-66ccb77bfb-zczmp" Apr 22 18:49:09.318123 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:09.318100 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-66ccb77bfb-zczmp" Apr 22 18:49:09.546374 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:09.546332 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-66ccb77bfb-zczmp" Apr 22 18:49:09.547736 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:09.547711 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-66ccb77bfb-zczmp" Apr 22 18:49:10.221843 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:10.221798 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cc5b6d2f-45de-45a1-9bb3-d90f5940d33b-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-dr26w\" (UID: \"cc5b6d2f-45de-45a1-9bb3-d90f5940d33b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dr26w" Apr 22 18:49:10.224165 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:10.224144 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cc5b6d2f-45de-45a1-9bb3-d90f5940d33b-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-dr26w\" (UID: \"cc5b6d2f-45de-45a1-9bb3-d90f5940d33b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dr26w" Apr 22 18:49:10.283552 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:10.283512 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-dr26w" Apr 22 18:49:10.404813 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:10.404735 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-dr26w"] Apr 22 18:49:10.407185 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:49:10.407153 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc5b6d2f_45de_45a1_9bb3_d90f5940d33b.slice/crio-c7918c94058a675a731272cf316221d926dd1d6a4c6108252303b2e96ad66698 WatchSource:0}: Error finding container c7918c94058a675a731272cf316221d926dd1d6a4c6108252303b2e96ad66698: Status 404 returned error can't find the container with id c7918c94058a675a731272cf316221d926dd1d6a4c6108252303b2e96ad66698 Apr 22 18:49:10.549479 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:10.549437 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-dr26w" event={"ID":"cc5b6d2f-45de-45a1-9bb3-d90f5940d33b","Type":"ContainerStarted","Data":"c7918c94058a675a731272cf316221d926dd1d6a4c6108252303b2e96ad66698"} Apr 22 18:49:12.555710 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:12.555676 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-dr26w" event={"ID":"cc5b6d2f-45de-45a1-9bb3-d90f5940d33b","Type":"ContainerStarted","Data":"5413f899e8ecc5afbfc1323ddb549aac70e0f91221189080d3027a38d76526e0"} Apr 22 18:49:17.467107 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.467054 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-dr26w" podStartSLOduration=22.039368495 podStartE2EDuration="23.467039517s" podCreationTimestamp="2026-04-22 18:48:54 +0000 UTC" firstStartedPulling="2026-04-22 18:49:10.409173948 +0000 UTC m=+136.857956256" lastFinishedPulling="2026-04-22 18:49:11.83684497 +0000 UTC m=+138.285627278" observedRunningTime="2026-04-22 18:49:12.568356247 +0000 UTC m=+139.017138577" watchObservedRunningTime="2026-04-22 18:49:17.467039517 +0000 UTC m=+143.915821848" Apr 22 18:49:17.467751 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.467730 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7cf88557b7-mcsdx"] Apr 22 18:49:17.485929 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.485898 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-56f5ddf4f8-ww8sg"] Apr 22 18:49:17.490610 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.490588 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-56f5ddf4f8-ww8sg" Apr 22 18:49:17.499458 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.499434 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-s8zf7"] Apr 22 18:49:17.502615 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.502595 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-bvh7k"] Apr 22 18:49:17.502781 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.502764 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-s8zf7" Apr 22 18:49:17.505498 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.505444 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-qbgrl\"" Apr 22 18:49:17.505646 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.505621 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 18:49:17.505767 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.505644 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 18:49:17.505922 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.505904 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-56f5ddf4f8-ww8sg"] Apr 22 18:49:17.506033 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.506019 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-bvh7k" Apr 22 18:49:17.508395 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.508376 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 18:49:17.508764 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.508420 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 18:49:17.509075 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.509055 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 18:49:17.509178 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.509117 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-zj8q7\"" Apr 22 18:49:17.509178 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.509059 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 18:49:17.512517 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.512473 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-bvh7k"] Apr 22 18:49:17.518698 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.518634 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-s8zf7"] Apr 22 18:49:17.577091 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.577055 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b93cb4a4-7991-414d-9679-eaf8caa890ce-registry-certificates\") pod \"image-registry-56f5ddf4f8-ww8sg\" (UID: \"b93cb4a4-7991-414d-9679-eaf8caa890ce\") " pod="openshift-image-registry/image-registry-56f5ddf4f8-ww8sg" Apr 22 18:49:17.577278 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.577118 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b93cb4a4-7991-414d-9679-eaf8caa890ce-bound-sa-token\") pod \"image-registry-56f5ddf4f8-ww8sg\" (UID: \"b93cb4a4-7991-414d-9679-eaf8caa890ce\") " pod="openshift-image-registry/image-registry-56f5ddf4f8-ww8sg" Apr 22 18:49:17.577278 ip-10-0-136-53 kubenswrapper[2579]: I0422 
18:49:17.577180 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdvj2\" (UniqueName: \"kubernetes.io/projected/b93cb4a4-7991-414d-9679-eaf8caa890ce-kube-api-access-gdvj2\") pod \"image-registry-56f5ddf4f8-ww8sg\" (UID: \"b93cb4a4-7991-414d-9679-eaf8caa890ce\") " pod="openshift-image-registry/image-registry-56f5ddf4f8-ww8sg" Apr 22 18:49:17.577278 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.577215 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b93cb4a4-7991-414d-9679-eaf8caa890ce-image-registry-private-configuration\") pod \"image-registry-56f5ddf4f8-ww8sg\" (UID: \"b93cb4a4-7991-414d-9679-eaf8caa890ce\") " pod="openshift-image-registry/image-registry-56f5ddf4f8-ww8sg" Apr 22 18:49:17.577398 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.577296 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b93cb4a4-7991-414d-9679-eaf8caa890ce-trusted-ca\") pod \"image-registry-56f5ddf4f8-ww8sg\" (UID: \"b93cb4a4-7991-414d-9679-eaf8caa890ce\") " pod="openshift-image-registry/image-registry-56f5ddf4f8-ww8sg" Apr 22 18:49:17.577398 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.577321 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b93cb4a4-7991-414d-9679-eaf8caa890ce-ca-trust-extracted\") pod \"image-registry-56f5ddf4f8-ww8sg\" (UID: \"b93cb4a4-7991-414d-9679-eaf8caa890ce\") " pod="openshift-image-registry/image-registry-56f5ddf4f8-ww8sg" Apr 22 18:49:17.577398 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.577349 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b93cb4a4-7991-414d-9679-eaf8caa890ce-registry-tls\") pod \"image-registry-56f5ddf4f8-ww8sg\" (UID: \"b93cb4a4-7991-414d-9679-eaf8caa890ce\") " pod="openshift-image-registry/image-registry-56f5ddf4f8-ww8sg" Apr 22 18:49:17.577398 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.577387 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b93cb4a4-7991-414d-9679-eaf8caa890ce-installation-pull-secrets\") pod \"image-registry-56f5ddf4f8-ww8sg\" (UID: \"b93cb4a4-7991-414d-9679-eaf8caa890ce\") " pod="openshift-image-registry/image-registry-56f5ddf4f8-ww8sg" Apr 22 18:49:17.677918 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.677878 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b93cb4a4-7991-414d-9679-eaf8caa890ce-trusted-ca\") pod \"image-registry-56f5ddf4f8-ww8sg\" (UID: \"b93cb4a4-7991-414d-9679-eaf8caa890ce\") " pod="openshift-image-registry/image-registry-56f5ddf4f8-ww8sg" Apr 22 18:49:17.677918 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.677921 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bc3d3928-7592-4441-a0c2-d7041e583080-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bvh7k\" (UID: \"bc3d3928-7592-4441-a0c2-d7041e583080\") " 
pod="openshift-insights/insights-runtime-extractor-bvh7k" Apr 22 18:49:17.678169 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.677945 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b93cb4a4-7991-414d-9679-eaf8caa890ce-ca-trust-extracted\") pod \"image-registry-56f5ddf4f8-ww8sg\" (UID: \"b93cb4a4-7991-414d-9679-eaf8caa890ce\") " pod="openshift-image-registry/image-registry-56f5ddf4f8-ww8sg" Apr 22 18:49:17.678169 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.677961 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bc3d3928-7592-4441-a0c2-d7041e583080-data-volume\") pod \"insights-runtime-extractor-bvh7k\" (UID: \"bc3d3928-7592-4441-a0c2-d7041e583080\") " pod="openshift-insights/insights-runtime-extractor-bvh7k" Apr 22 18:49:17.678169 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.677991 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b93cb4a4-7991-414d-9679-eaf8caa890ce-registry-tls\") pod \"image-registry-56f5ddf4f8-ww8sg\" (UID: \"b93cb4a4-7991-414d-9679-eaf8caa890ce\") " pod="openshift-image-registry/image-registry-56f5ddf4f8-ww8sg" Apr 22 18:49:17.678169 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.678010 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b93cb4a4-7991-414d-9679-eaf8caa890ce-installation-pull-secrets\") pod \"image-registry-56f5ddf4f8-ww8sg\" (UID: \"b93cb4a4-7991-414d-9679-eaf8caa890ce\") " pod="openshift-image-registry/image-registry-56f5ddf4f8-ww8sg" Apr 22 18:49:17.678427 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.678400 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b93cb4a4-7991-414d-9679-eaf8caa890ce-registry-certificates\") pod \"image-registry-56f5ddf4f8-ww8sg\" (UID: \"b93cb4a4-7991-414d-9679-eaf8caa890ce\") " pod="openshift-image-registry/image-registry-56f5ddf4f8-ww8sg" Apr 22 18:49:17.678571 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.678555 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzlk4\" (UniqueName: \"kubernetes.io/projected/8ce5885b-3d73-4c7a-9648-01b63671ceaa-kube-api-access-vzlk4\") pod \"downloads-6bcc868b7-s8zf7\" (UID: \"8ce5885b-3d73-4c7a-9648-01b63671ceaa\") " pod="openshift-console/downloads-6bcc868b7-s8zf7" Apr 22 18:49:17.678638 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.678597 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bc3d3928-7592-4441-a0c2-d7041e583080-crio-socket\") pod \"insights-runtime-extractor-bvh7k\" (UID: \"bc3d3928-7592-4441-a0c2-d7041e583080\") " pod="openshift-insights/insights-runtime-extractor-bvh7k" Apr 22 18:49:17.678695 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.678652 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bc3d3928-7592-4441-a0c2-d7041e583080-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bvh7k\" (UID: \"bc3d3928-7592-4441-a0c2-d7041e583080\") " 
pod="openshift-insights/insights-runtime-extractor-bvh7k" Apr 22 18:49:17.678695 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.678654 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b93cb4a4-7991-414d-9679-eaf8caa890ce-ca-trust-extracted\") pod \"image-registry-56f5ddf4f8-ww8sg\" (UID: \"b93cb4a4-7991-414d-9679-eaf8caa890ce\") " pod="openshift-image-registry/image-registry-56f5ddf4f8-ww8sg" Apr 22 18:49:17.678796 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.678730 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9d8c\" (UniqueName: \"kubernetes.io/projected/bc3d3928-7592-4441-a0c2-d7041e583080-kube-api-access-w9d8c\") pod \"insights-runtime-extractor-bvh7k\" (UID: \"bc3d3928-7592-4441-a0c2-d7041e583080\") " pod="openshift-insights/insights-runtime-extractor-bvh7k" Apr 22 18:49:17.678796 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.678782 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b93cb4a4-7991-414d-9679-eaf8caa890ce-bound-sa-token\") pod \"image-registry-56f5ddf4f8-ww8sg\" (UID: \"b93cb4a4-7991-414d-9679-eaf8caa890ce\") " pod="openshift-image-registry/image-registry-56f5ddf4f8-ww8sg" Apr 22 18:49:17.678894 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.678828 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gdvj2\" (UniqueName: \"kubernetes.io/projected/b93cb4a4-7991-414d-9679-eaf8caa890ce-kube-api-access-gdvj2\") pod \"image-registry-56f5ddf4f8-ww8sg\" (UID: \"b93cb4a4-7991-414d-9679-eaf8caa890ce\") " pod="openshift-image-registry/image-registry-56f5ddf4f8-ww8sg" Apr 22 18:49:17.678894 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.678862 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b93cb4a4-7991-414d-9679-eaf8caa890ce-image-registry-private-configuration\") pod \"image-registry-56f5ddf4f8-ww8sg\" (UID: \"b93cb4a4-7991-414d-9679-eaf8caa890ce\") " pod="openshift-image-registry/image-registry-56f5ddf4f8-ww8sg" Apr 22 18:49:17.679356 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.679328 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b93cb4a4-7991-414d-9679-eaf8caa890ce-trusted-ca\") pod \"image-registry-56f5ddf4f8-ww8sg\" (UID: \"b93cb4a4-7991-414d-9679-eaf8caa890ce\") " pod="openshift-image-registry/image-registry-56f5ddf4f8-ww8sg" Apr 22 18:49:17.680962 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.680939 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b93cb4a4-7991-414d-9679-eaf8caa890ce-registry-certificates\") pod \"image-registry-56f5ddf4f8-ww8sg\" (UID: \"b93cb4a4-7991-414d-9679-eaf8caa890ce\") " pod="openshift-image-registry/image-registry-56f5ddf4f8-ww8sg" Apr 22 18:49:17.682093 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.681409 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b93cb4a4-7991-414d-9679-eaf8caa890ce-installation-pull-secrets\") pod \"image-registry-56f5ddf4f8-ww8sg\" (UID: \"b93cb4a4-7991-414d-9679-eaf8caa890ce\") " 
pod="openshift-image-registry/image-registry-56f5ddf4f8-ww8sg" Apr 22 18:49:17.683730 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.682670 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b93cb4a4-7991-414d-9679-eaf8caa890ce-image-registry-private-configuration\") pod \"image-registry-56f5ddf4f8-ww8sg\" (UID: \"b93cb4a4-7991-414d-9679-eaf8caa890ce\") " pod="openshift-image-registry/image-registry-56f5ddf4f8-ww8sg" Apr 22 18:49:17.683730 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.682943 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b93cb4a4-7991-414d-9679-eaf8caa890ce-registry-tls\") pod \"image-registry-56f5ddf4f8-ww8sg\" (UID: \"b93cb4a4-7991-414d-9679-eaf8caa890ce\") " pod="openshift-image-registry/image-registry-56f5ddf4f8-ww8sg" Apr 22 18:49:17.689327 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.689304 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b93cb4a4-7991-414d-9679-eaf8caa890ce-bound-sa-token\") pod \"image-registry-56f5ddf4f8-ww8sg\" (UID: \"b93cb4a4-7991-414d-9679-eaf8caa890ce\") " pod="openshift-image-registry/image-registry-56f5ddf4f8-ww8sg" Apr 22 18:49:17.689603 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.689585 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdvj2\" (UniqueName: \"kubernetes.io/projected/b93cb4a4-7991-414d-9679-eaf8caa890ce-kube-api-access-gdvj2\") pod \"image-registry-56f5ddf4f8-ww8sg\" (UID: \"b93cb4a4-7991-414d-9679-eaf8caa890ce\") " pod="openshift-image-registry/image-registry-56f5ddf4f8-ww8sg" Apr 22 18:49:17.779247 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.779213 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9d8c\" (UniqueName: \"kubernetes.io/projected/bc3d3928-7592-4441-a0c2-d7041e583080-kube-api-access-w9d8c\") pod \"insights-runtime-extractor-bvh7k\" (UID: \"bc3d3928-7592-4441-a0c2-d7041e583080\") " pod="openshift-insights/insights-runtime-extractor-bvh7k" Apr 22 18:49:17.779425 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.779269 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bc3d3928-7592-4441-a0c2-d7041e583080-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bvh7k\" (UID: \"bc3d3928-7592-4441-a0c2-d7041e583080\") " pod="openshift-insights/insights-runtime-extractor-bvh7k" Apr 22 18:49:17.779425 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.779307 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bc3d3928-7592-4441-a0c2-d7041e583080-data-volume\") pod \"insights-runtime-extractor-bvh7k\" (UID: \"bc3d3928-7592-4441-a0c2-d7041e583080\") " pod="openshift-insights/insights-runtime-extractor-bvh7k" Apr 22 18:49:17.779425 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.779359 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vzlk4\" (UniqueName: \"kubernetes.io/projected/8ce5885b-3d73-4c7a-9648-01b63671ceaa-kube-api-access-vzlk4\") pod \"downloads-6bcc868b7-s8zf7\" (UID: \"8ce5885b-3d73-4c7a-9648-01b63671ceaa\") " pod="openshift-console/downloads-6bcc868b7-s8zf7" Apr 22 18:49:17.779425 
ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.779374 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bc3d3928-7592-4441-a0c2-d7041e583080-crio-socket\") pod \"insights-runtime-extractor-bvh7k\" (UID: \"bc3d3928-7592-4441-a0c2-d7041e583080\") " pod="openshift-insights/insights-runtime-extractor-bvh7k" Apr 22 18:49:17.779425 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.779391 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bc3d3928-7592-4441-a0c2-d7041e583080-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bvh7k\" (UID: \"bc3d3928-7592-4441-a0c2-d7041e583080\") " pod="openshift-insights/insights-runtime-extractor-bvh7k" Apr 22 18:49:17.779639 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.779545 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bc3d3928-7592-4441-a0c2-d7041e583080-crio-socket\") pod \"insights-runtime-extractor-bvh7k\" (UID: \"bc3d3928-7592-4441-a0c2-d7041e583080\") " pod="openshift-insights/insights-runtime-extractor-bvh7k" Apr 22 18:49:17.779889 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.779870 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bc3d3928-7592-4441-a0c2-d7041e583080-data-volume\") pod \"insights-runtime-extractor-bvh7k\" (UID: \"bc3d3928-7592-4441-a0c2-d7041e583080\") " pod="openshift-insights/insights-runtime-extractor-bvh7k" Apr 22 18:49:17.780116 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.780101 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bc3d3928-7592-4441-a0c2-d7041e583080-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bvh7k\" (UID: \"bc3d3928-7592-4441-a0c2-d7041e583080\") " pod="openshift-insights/insights-runtime-extractor-bvh7k" Apr 22 18:49:17.781747 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.781729 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bc3d3928-7592-4441-a0c2-d7041e583080-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bvh7k\" (UID: \"bc3d3928-7592-4441-a0c2-d7041e583080\") " pod="openshift-insights/insights-runtime-extractor-bvh7k" Apr 22 18:49:17.787178 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.787158 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9d8c\" (UniqueName: \"kubernetes.io/projected/bc3d3928-7592-4441-a0c2-d7041e583080-kube-api-access-w9d8c\") pod \"insights-runtime-extractor-bvh7k\" (UID: \"bc3d3928-7592-4441-a0c2-d7041e583080\") " pod="openshift-insights/insights-runtime-extractor-bvh7k" Apr 22 18:49:17.788087 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.788066 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzlk4\" (UniqueName: \"kubernetes.io/projected/8ce5885b-3d73-4c7a-9648-01b63671ceaa-kube-api-access-vzlk4\") pod \"downloads-6bcc868b7-s8zf7\" (UID: \"8ce5885b-3d73-4c7a-9648-01b63671ceaa\") " pod="openshift-console/downloads-6bcc868b7-s8zf7" Apr 22 18:49:17.799078 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.799054 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-56f5ddf4f8-ww8sg" Apr 22 18:49:17.814402 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.814374 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-s8zf7" Apr 22 18:49:17.821603 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.821575 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-bvh7k" Apr 22 18:49:17.942599 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.942538 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-56f5ddf4f8-ww8sg"] Apr 22 18:49:17.946175 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:49:17.946144 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb93cb4a4_7991_414d_9679_eaf8caa890ce.slice/crio-036d2b738b4601d9c2adee6f333efd9e427e2d5ae7940b37fd8bf86f893dbec4 WatchSource:0}: Error finding container 036d2b738b4601d9c2adee6f333efd9e427e2d5ae7940b37fd8bf86f893dbec4: Status 404 returned error can't find the container with id 036d2b738b4601d9c2adee6f333efd9e427e2d5ae7940b37fd8bf86f893dbec4 Apr 22 18:49:17.952173 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.952148 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-s8zf7"] Apr 22 18:49:17.955097 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:49:17.955074 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ce5885b_3d73_4c7a_9648_01b63671ceaa.slice/crio-da1efa1f952d8216971fc96f5ca495b9c01b347e8350552e4bbdebf3b7cc2bd0 WatchSource:0}: Error finding container da1efa1f952d8216971fc96f5ca495b9c01b347e8350552e4bbdebf3b7cc2bd0: Status 404 returned error can't find the container with id da1efa1f952d8216971fc96f5ca495b9c01b347e8350552e4bbdebf3b7cc2bd0 Apr 22 18:49:17.967763 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:17.967739 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-bvh7k"] Apr 22 18:49:17.977510 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:49:17.977460 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc3d3928_7592_4441_a0c2_d7041e583080.slice/crio-049f1d8e526ef5e5ae178844ed690d1210dda40f571564a2ebede41347502c2a WatchSource:0}: Error finding container 049f1d8e526ef5e5ae178844ed690d1210dda40f571564a2ebede41347502c2a: Status 404 returned error can't find the container with id 049f1d8e526ef5e5ae178844ed690d1210dda40f571564a2ebede41347502c2a Apr 22 18:49:18.572062 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:18.572017 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bvh7k" event={"ID":"bc3d3928-7592-4441-a0c2-d7041e583080","Type":"ContainerStarted","Data":"e92462a404da7676e9eaca90b8e7f10704aca44feae28db2f6943bb249215f3d"} Apr 22 18:49:18.572062 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:18.572065 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bvh7k" event={"ID":"bc3d3928-7592-4441-a0c2-d7041e583080","Type":"ContainerStarted","Data":"049f1d8e526ef5e5ae178844ed690d1210dda40f571564a2ebede41347502c2a"} Apr 22 18:49:18.573449 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:18.573407 2579 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-s8zf7" event={"ID":"8ce5885b-3d73-4c7a-9648-01b63671ceaa","Type":"ContainerStarted","Data":"da1efa1f952d8216971fc96f5ca495b9c01b347e8350552e4bbdebf3b7cc2bd0"} Apr 22 18:49:18.574988 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:18.574951 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-56f5ddf4f8-ww8sg" event={"ID":"b93cb4a4-7991-414d-9679-eaf8caa890ce","Type":"ContainerStarted","Data":"177293a2a68a59de695a1a892d24d62bdceabb94436105e1e598edb0f20e620a"} Apr 22 18:49:18.574988 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:18.574984 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-56f5ddf4f8-ww8sg" event={"ID":"b93cb4a4-7991-414d-9679-eaf8caa890ce","Type":"ContainerStarted","Data":"036d2b738b4601d9c2adee6f333efd9e427e2d5ae7940b37fd8bf86f893dbec4"} Apr 22 18:49:18.575243 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:18.575220 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-56f5ddf4f8-ww8sg" Apr 22 18:49:19.579565 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:19.579525 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bvh7k" event={"ID":"bc3d3928-7592-4441-a0c2-d7041e583080","Type":"ContainerStarted","Data":"9df7b8c5fa5d45665ad871d59d2e4c5e7f46b37200a74fa1d0d4968e4deee2c3"} Apr 22 18:49:20.583520 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:20.583466 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bvh7k" event={"ID":"bc3d3928-7592-4441-a0c2-d7041e583080","Type":"ContainerStarted","Data":"0891eb0fdbd70d9cc363c69a75f3cba9f4251bbf1dc5e5ca399315214b1e73c2"} Apr 22 18:49:20.599186 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:20.599125 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-bvh7k" podStartSLOduration=1.486869298 podStartE2EDuration="3.599107716s" podCreationTimestamp="2026-04-22 18:49:17 +0000 UTC" firstStartedPulling="2026-04-22 18:49:18.03066396 +0000 UTC m=+144.479446286" lastFinishedPulling="2026-04-22 18:49:20.142902381 +0000 UTC m=+146.591684704" observedRunningTime="2026-04-22 18:49:20.597786132 +0000 UTC m=+147.046568462" watchObservedRunningTime="2026-04-22 18:49:20.599107716 +0000 UTC m=+147.047890039" Apr 22 18:49:20.599578 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:20.599536 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-56f5ddf4f8-ww8sg" podStartSLOduration=3.5995285 podStartE2EDuration="3.5995285s" podCreationTimestamp="2026-04-22 18:49:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:49:18.592908434 +0000 UTC m=+145.041690767" watchObservedRunningTime="2026-04-22 18:49:20.5995285 +0000 UTC m=+147.048310831" Apr 22 18:49:27.473853 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:27.473812 2579 patch_prober.go:28] interesting pod/image-registry-7cf88557b7-mcsdx container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 
Apr 22 18:49:27.474275 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:27.473877 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" podUID="ceb18df5-258c-4b70-9799-2da231a29e8f" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 18:49:29.458048 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:49:29.457997 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-qd4bd" podUID="493591d7-8b3a-41e8-8c8c-40d1e116007e"
Apr 22 18:49:29.472404 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:49:29.472360 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-2qtxz" podUID="3d74ab93-b985-4a88-882c-10c6a352ac61"
Apr 22 18:49:29.607796 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:29.607763 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qd4bd"
Apr 22 18:49:31.169963 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:49:31.169919 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-z69xm" podUID="89068fbb-3d09-49a2-a369-e83cb48aaf0f"
Apr 22 18:49:34.413332 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:34.413289 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d74ab93-b985-4a88-882c-10c6a352ac61-cert\") pod \"ingress-canary-2qtxz\" (UID: \"3d74ab93-b985-4a88-882c-10c6a352ac61\") " pod="openshift-ingress-canary/ingress-canary-2qtxz"
Apr 22 18:49:34.413931 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:34.413900 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/493591d7-8b3a-41e8-8c8c-40d1e116007e-metrics-tls\") pod \"dns-default-qd4bd\" (UID: \"493591d7-8b3a-41e8-8c8c-40d1e116007e\") " pod="openshift-dns/dns-default-qd4bd"
Apr 22 18:49:34.416618 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:34.416595 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/493591d7-8b3a-41e8-8c8c-40d1e116007e-metrics-tls\") pod \"dns-default-qd4bd\" (UID: \"493591d7-8b3a-41e8-8c8c-40d1e116007e\") " pod="openshift-dns/dns-default-qd4bd"
Apr 22 18:49:34.416745 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:34.416693 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d74ab93-b985-4a88-882c-10c6a352ac61-cert\") pod \"ingress-canary-2qtxz\" (UID: \"3d74ab93-b985-4a88-882c-10c6a352ac61\") " pod="openshift-ingress-canary/ingress-canary-2qtxz"
Apr 22 18:49:34.622466 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:34.622418 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-s8zf7" event={"ID":"8ce5885b-3d73-4c7a-9648-01b63671ceaa","Type":"ContainerStarted","Data":"e21d23f69c448ee8a69781fa25071287a07f30e36d3f5a802ebfea7c1dea8a4d"}
Apr 22 18:49:34.622768 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:34.622742 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-s8zf7"
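Note: the pod_workers.go "context deadline exceeded" errors land almost exactly two minutes after each pod's creation. Kubelet's volume manager waits a bounded time (2m3s in upstream kubelet's podAttachAndMountTimeout, if memory serves; treat the exact constant as an assumption) for all volumes before the sync attempt fails and is retried. dns-default-qd4bd was created at 18:47:26, errors at 18:49:29, and its metrics-tls secret finally mounts at 18:49:34 on the next sync. A quick elapsed-time check from the logged timestamps:

    // mountwait.go - sketch: how long did dns-default-qd4bd wait between
    // pod creation and the "unmounted volumes" sync error? Timestamps
    // are copied from the log above.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const layout = "2006-01-02 15:04:05 -0700 MST"
        created, _ := time.Parse(layout, "2026-04-22 18:47:26 +0000 UTC")
        failed, _ := time.Parse(layout, "2026-04-22 18:49:29 +0000 UTC")
        fmt.Println("waited:", failed.Sub(created)) // waited: 2m3s
    }
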
probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-s8zf7" Apr 22 18:49:34.636121 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:34.636086 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-s8zf7" Apr 22 18:49:34.638529 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:34.638459 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-s8zf7" podStartSLOduration=1.678528525 podStartE2EDuration="17.638446182s" podCreationTimestamp="2026-04-22 18:49:17 +0000 UTC" firstStartedPulling="2026-04-22 18:49:17.957461231 +0000 UTC m=+144.406243539" lastFinishedPulling="2026-04-22 18:49:33.917378888 +0000 UTC m=+160.366161196" observedRunningTime="2026-04-22 18:49:34.636453115 +0000 UTC m=+161.085235445" watchObservedRunningTime="2026-04-22 18:49:34.638446182 +0000 UTC m=+161.087228516" Apr 22 18:49:34.710512 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:34.710409 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zg5rc\"" Apr 22 18:49:34.719537 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:34.719505 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qd4bd" Apr 22 18:49:34.859686 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:34.859649 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qd4bd"] Apr 22 18:49:34.863038 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:49:34.863002 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod493591d7_8b3a_41e8_8c8c_40d1e116007e.slice/crio-ede70ee4013436c85e9ccaf6ebb89655be9641ac7e5a51d8508c1d7ac90a9c2f WatchSource:0}: Error finding container ede70ee4013436c85e9ccaf6ebb89655be9641ac7e5a51d8508c1d7ac90a9c2f: Status 404 returned error can't find the container with id ede70ee4013436c85e9ccaf6ebb89655be9641ac7e5a51d8508c1d7ac90a9c2f Apr 22 18:49:35.070339 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.070301 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-776x8"] Apr 22 18:49:35.075344 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.075320 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-776x8" Apr 22 18:49:35.077432 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.077405 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 18:49:35.077681 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.077658 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 18:49:35.077818 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.077796 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 18:49:35.077891 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.077878 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 18:49:35.078105 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.077954 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 18:49:35.078105 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.077801 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-sw8gt\"" Apr 22 18:49:35.078105 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.078076 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 18:49:35.121462 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.121417 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1e48e07e-79a3-4925-a19a-b296810004a0-node-exporter-accelerators-collector-config\") pod \"node-exporter-776x8\" (UID: \"1e48e07e-79a3-4925-a19a-b296810004a0\") " pod="openshift-monitoring/node-exporter-776x8" Apr 22 18:49:35.121688 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.121480 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1e48e07e-79a3-4925-a19a-b296810004a0-root\") pod \"node-exporter-776x8\" (UID: \"1e48e07e-79a3-4925-a19a-b296810004a0\") " pod="openshift-monitoring/node-exporter-776x8" Apr 22 18:49:35.121688 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.121625 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5bdg\" (UniqueName: \"kubernetes.io/projected/1e48e07e-79a3-4925-a19a-b296810004a0-kube-api-access-z5bdg\") pod \"node-exporter-776x8\" (UID: \"1e48e07e-79a3-4925-a19a-b296810004a0\") " pod="openshift-monitoring/node-exporter-776x8" Apr 22 18:49:35.121688 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.121673 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1e48e07e-79a3-4925-a19a-b296810004a0-sys\") pod \"node-exporter-776x8\" (UID: \"1e48e07e-79a3-4925-a19a-b296810004a0\") " pod="openshift-monitoring/node-exporter-776x8" Apr 22 18:49:35.121840 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.121751 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/1e48e07e-79a3-4925-a19a-b296810004a0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-776x8\" (UID: \"1e48e07e-79a3-4925-a19a-b296810004a0\") " pod="openshift-monitoring/node-exporter-776x8" Apr 22 18:49:35.121840 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.121777 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1e48e07e-79a3-4925-a19a-b296810004a0-node-exporter-textfile\") pod \"node-exporter-776x8\" (UID: \"1e48e07e-79a3-4925-a19a-b296810004a0\") " pod="openshift-monitoring/node-exporter-776x8" Apr 22 18:49:35.121840 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.121814 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1e48e07e-79a3-4925-a19a-b296810004a0-node-exporter-tls\") pod \"node-exporter-776x8\" (UID: \"1e48e07e-79a3-4925-a19a-b296810004a0\") " pod="openshift-monitoring/node-exporter-776x8" Apr 22 18:49:35.121980 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.121849 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1e48e07e-79a3-4925-a19a-b296810004a0-node-exporter-wtmp\") pod \"node-exporter-776x8\" (UID: \"1e48e07e-79a3-4925-a19a-b296810004a0\") " pod="openshift-monitoring/node-exporter-776x8" Apr 22 18:49:35.121980 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.121947 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1e48e07e-79a3-4925-a19a-b296810004a0-metrics-client-ca\") pod \"node-exporter-776x8\" (UID: \"1e48e07e-79a3-4925-a19a-b296810004a0\") " pod="openshift-monitoring/node-exporter-776x8" Apr 22 18:49:35.222598 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.222551 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1e48e07e-79a3-4925-a19a-b296810004a0-node-exporter-tls\") pod \"node-exporter-776x8\" (UID: \"1e48e07e-79a3-4925-a19a-b296810004a0\") " pod="openshift-monitoring/node-exporter-776x8" Apr 22 18:49:35.222787 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.222608 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1e48e07e-79a3-4925-a19a-b296810004a0-node-exporter-wtmp\") pod \"node-exporter-776x8\" (UID: \"1e48e07e-79a3-4925-a19a-b296810004a0\") " pod="openshift-monitoring/node-exporter-776x8" Apr 22 18:49:35.222787 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.222651 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1e48e07e-79a3-4925-a19a-b296810004a0-metrics-client-ca\") pod \"node-exporter-776x8\" (UID: \"1e48e07e-79a3-4925-a19a-b296810004a0\") " pod="openshift-monitoring/node-exporter-776x8" Apr 22 18:49:35.222787 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.222700 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1e48e07e-79a3-4925-a19a-b296810004a0-node-exporter-accelerators-collector-config\") pod \"node-exporter-776x8\" (UID: 
\"1e48e07e-79a3-4925-a19a-b296810004a0\") " pod="openshift-monitoring/node-exporter-776x8" Apr 22 18:49:35.222787 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:49:35.222734 2579 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 18:49:35.222992 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.222743 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1e48e07e-79a3-4925-a19a-b296810004a0-root\") pod \"node-exporter-776x8\" (UID: \"1e48e07e-79a3-4925-a19a-b296810004a0\") " pod="openshift-monitoring/node-exporter-776x8" Apr 22 18:49:35.222992 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:49:35.222822 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e48e07e-79a3-4925-a19a-b296810004a0-node-exporter-tls podName:1e48e07e-79a3-4925-a19a-b296810004a0 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:35.722801337 +0000 UTC m=+162.171583645 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/1e48e07e-79a3-4925-a19a-b296810004a0-node-exporter-tls") pod "node-exporter-776x8" (UID: "1e48e07e-79a3-4925-a19a-b296810004a0") : secret "node-exporter-tls" not found Apr 22 18:49:35.222992 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.222836 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1e48e07e-79a3-4925-a19a-b296810004a0-node-exporter-wtmp\") pod \"node-exporter-776x8\" (UID: \"1e48e07e-79a3-4925-a19a-b296810004a0\") " pod="openshift-monitoring/node-exporter-776x8" Apr 22 18:49:35.222992 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.222908 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z5bdg\" (UniqueName: \"kubernetes.io/projected/1e48e07e-79a3-4925-a19a-b296810004a0-kube-api-access-z5bdg\") pod \"node-exporter-776x8\" (UID: \"1e48e07e-79a3-4925-a19a-b296810004a0\") " pod="openshift-monitoring/node-exporter-776x8" Apr 22 18:49:35.222992 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.222916 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1e48e07e-79a3-4925-a19a-b296810004a0-root\") pod \"node-exporter-776x8\" (UID: \"1e48e07e-79a3-4925-a19a-b296810004a0\") " pod="openshift-monitoring/node-exporter-776x8" Apr 22 18:49:35.222992 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.222949 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1e48e07e-79a3-4925-a19a-b296810004a0-sys\") pod \"node-exporter-776x8\" (UID: \"1e48e07e-79a3-4925-a19a-b296810004a0\") " pod="openshift-monitoring/node-exporter-776x8" Apr 22 18:49:35.223282 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.222995 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1e48e07e-79a3-4925-a19a-b296810004a0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-776x8\" (UID: \"1e48e07e-79a3-4925-a19a-b296810004a0\") " pod="openshift-monitoring/node-exporter-776x8" Apr 22 18:49:35.223282 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.223022 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/1e48e07e-79a3-4925-a19a-b296810004a0-node-exporter-textfile\") pod \"node-exporter-776x8\" (UID: \"1e48e07e-79a3-4925-a19a-b296810004a0\") " pod="openshift-monitoring/node-exporter-776x8" Apr 22 18:49:35.223282 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.223040 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1e48e07e-79a3-4925-a19a-b296810004a0-sys\") pod \"node-exporter-776x8\" (UID: \"1e48e07e-79a3-4925-a19a-b296810004a0\") " pod="openshift-monitoring/node-exporter-776x8" Apr 22 18:49:35.223428 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.223343 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1e48e07e-79a3-4925-a19a-b296810004a0-metrics-client-ca\") pod \"node-exporter-776x8\" (UID: \"1e48e07e-79a3-4925-a19a-b296810004a0\") " pod="openshift-monitoring/node-exporter-776x8" Apr 22 18:49:35.223428 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.223343 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1e48e07e-79a3-4925-a19a-b296810004a0-node-exporter-accelerators-collector-config\") pod \"node-exporter-776x8\" (UID: \"1e48e07e-79a3-4925-a19a-b296810004a0\") " pod="openshift-monitoring/node-exporter-776x8" Apr 22 18:49:35.223428 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.223405 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1e48e07e-79a3-4925-a19a-b296810004a0-node-exporter-textfile\") pod \"node-exporter-776x8\" (UID: \"1e48e07e-79a3-4925-a19a-b296810004a0\") " pod="openshift-monitoring/node-exporter-776x8" Apr 22 18:49:35.225637 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.225615 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1e48e07e-79a3-4925-a19a-b296810004a0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-776x8\" (UID: \"1e48e07e-79a3-4925-a19a-b296810004a0\") " pod="openshift-monitoring/node-exporter-776x8" Apr 22 18:49:35.233812 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.233789 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5bdg\" (UniqueName: \"kubernetes.io/projected/1e48e07e-79a3-4925-a19a-b296810004a0-kube-api-access-z5bdg\") pod \"node-exporter-776x8\" (UID: \"1e48e07e-79a3-4925-a19a-b296810004a0\") " pod="openshift-monitoring/node-exporter-776x8" Apr 22 18:49:35.627581 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.627546 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qd4bd" event={"ID":"493591d7-8b3a-41e8-8c8c-40d1e116007e","Type":"ContainerStarted","Data":"ede70ee4013436c85e9ccaf6ebb89655be9641ac7e5a51d8508c1d7ac90a9c2f"} Apr 22 18:49:35.727389 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.727346 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1e48e07e-79a3-4925-a19a-b296810004a0-node-exporter-tls\") pod \"node-exporter-776x8\" (UID: \"1e48e07e-79a3-4925-a19a-b296810004a0\") " pod="openshift-monitoring/node-exporter-776x8" Apr 22 18:49:35.731654 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.731612 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1e48e07e-79a3-4925-a19a-b296810004a0-node-exporter-tls\") pod \"node-exporter-776x8\" (UID: \"1e48e07e-79a3-4925-a19a-b296810004a0\") " pod="openshift-monitoring/node-exporter-776x8" Apr 22 18:49:35.988147 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:35.987677 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-776x8" Apr 22 18:49:36.630984 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:36.630948 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-776x8" event={"ID":"1e48e07e-79a3-4925-a19a-b296810004a0","Type":"ContainerStarted","Data":"9e58c314370e3fdff2e5ad2644240dac18995172e18d11c330082a1df12f093c"} Apr 22 18:49:36.632501 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:36.632458 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qd4bd" event={"ID":"493591d7-8b3a-41e8-8c8c-40d1e116007e","Type":"ContainerStarted","Data":"56c01ed7c944535e35dc3b2f529ac066962d5c3bbfbca10e18b431f2a21dd473"} Apr 22 18:49:37.472619 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:37.472577 2579 patch_prober.go:28] interesting pod/image-registry-7cf88557b7-mcsdx container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 18:49:37.472799 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:37.472632 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" podUID="ceb18df5-258c-4b70-9799-2da231a29e8f" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:49:37.636862 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:37.636824 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qd4bd" event={"ID":"493591d7-8b3a-41e8-8c8c-40d1e116007e","Type":"ContainerStarted","Data":"b23048bb2324b43d540df5148f7ef0bc399dbcb62e85e45835a8fe8bec981a47"} Apr 22 18:49:37.637252 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:37.637012 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-qd4bd" Apr 22 18:49:37.651419 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:37.651364 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-qd4bd" podStartSLOduration=130.089796751 podStartE2EDuration="2m11.651346613s" podCreationTimestamp="2026-04-22 18:47:26 +0000 UTC" firstStartedPulling="2026-04-22 18:49:34.865113762 +0000 UTC m=+161.313896071" lastFinishedPulling="2026-04-22 18:49:36.426663606 +0000 UTC m=+162.875445933" observedRunningTime="2026-04-22 18:49:37.650656301 +0000 UTC m=+164.099438643" watchObservedRunningTime="2026-04-22 18:49:37.651346613 +0000 UTC m=+164.100128945" Apr 22 18:49:37.804147 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:37.804101 2579 patch_prober.go:28] interesting pod/image-registry-56f5ddf4f8-ww8sg container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 18:49:37.804299 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:37.804161 2579 
prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-56f5ddf4f8-ww8sg" podUID="b93cb4a4-7991-414d-9679-eaf8caa890ce" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:49:38.641616 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:38.641570 2579 generic.go:358] "Generic (PLEG): container finished" podID="1e48e07e-79a3-4925-a19a-b296810004a0" containerID="55b40baa96f89b61f48a192d3516bd248e621348d4158a59d2cfd62e234b3db0" exitCode=0 Apr 22 18:49:38.642079 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:38.641673 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-776x8" event={"ID":"1e48e07e-79a3-4925-a19a-b296810004a0","Type":"ContainerDied","Data":"55b40baa96f89b61f48a192d3516bd248e621348d4158a59d2cfd62e234b3db0"} Apr 22 18:49:39.449874 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:39.449835 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-84d4bd4d44-fbqmd"] Apr 22 18:49:39.463902 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:39.463870 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-84d4bd4d44-fbqmd"] Apr 22 18:49:39.464075 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:39.463979 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-84d4bd4d44-fbqmd" Apr 22 18:49:39.466004 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:39.465981 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 22 18:49:39.466752 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:39.466668 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 18:49:39.466752 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:39.466681 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-jnzvt\"" Apr 22 18:49:39.466874 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:39.466671 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 22 18:49:39.466874 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:39.466825 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-2913ogejloi1p\"" Apr 22 18:49:39.466955 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:39.466910 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 22 18:49:39.562359 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:39.562274 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/2c00d446-bbf6-4aab-a27c-ed0836ea1765-secret-metrics-server-tls\") pod \"metrics-server-84d4bd4d44-fbqmd\" (UID: \"2c00d446-bbf6-4aab-a27c-ed0836ea1765\") " pod="openshift-monitoring/metrics-server-84d4bd4d44-fbqmd" Apr 22 18:49:39.562359 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:39.562318 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: 
\"kubernetes.io/secret/2c00d446-bbf6-4aab-a27c-ed0836ea1765-secret-metrics-server-client-certs\") pod \"metrics-server-84d4bd4d44-fbqmd\" (UID: \"2c00d446-bbf6-4aab-a27c-ed0836ea1765\") " pod="openshift-monitoring/metrics-server-84d4bd4d44-fbqmd" Apr 22 18:49:39.562615 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:39.562412 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c00d446-bbf6-4aab-a27c-ed0836ea1765-client-ca-bundle\") pod \"metrics-server-84d4bd4d44-fbqmd\" (UID: \"2c00d446-bbf6-4aab-a27c-ed0836ea1765\") " pod="openshift-monitoring/metrics-server-84d4bd4d44-fbqmd" Apr 22 18:49:39.562615 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:39.562504 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c00d446-bbf6-4aab-a27c-ed0836ea1765-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-84d4bd4d44-fbqmd\" (UID: \"2c00d446-bbf6-4aab-a27c-ed0836ea1765\") " pod="openshift-monitoring/metrics-server-84d4bd4d44-fbqmd" Apr 22 18:49:39.562615 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:39.562594 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/2c00d446-bbf6-4aab-a27c-ed0836ea1765-metrics-server-audit-profiles\") pod \"metrics-server-84d4bd4d44-fbqmd\" (UID: \"2c00d446-bbf6-4aab-a27c-ed0836ea1765\") " pod="openshift-monitoring/metrics-server-84d4bd4d44-fbqmd" Apr 22 18:49:39.562747 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:39.562627 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zzbs\" (UniqueName: \"kubernetes.io/projected/2c00d446-bbf6-4aab-a27c-ed0836ea1765-kube-api-access-7zzbs\") pod \"metrics-server-84d4bd4d44-fbqmd\" (UID: \"2c00d446-bbf6-4aab-a27c-ed0836ea1765\") " pod="openshift-monitoring/metrics-server-84d4bd4d44-fbqmd" Apr 22 18:49:39.562747 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:39.562653 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/2c00d446-bbf6-4aab-a27c-ed0836ea1765-audit-log\") pod \"metrics-server-84d4bd4d44-fbqmd\" (UID: \"2c00d446-bbf6-4aab-a27c-ed0836ea1765\") " pod="openshift-monitoring/metrics-server-84d4bd4d44-fbqmd" Apr 22 18:49:39.584425 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:39.584396 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-56f5ddf4f8-ww8sg" Apr 22 18:49:39.647592 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:39.647547 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-776x8" event={"ID":"1e48e07e-79a3-4925-a19a-b296810004a0","Type":"ContainerStarted","Data":"5ac1776aab9c3e7e84adc80b76eaed24a75dbbc541d24a8ac3228e4820a702c9"} Apr 22 18:49:39.647592 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:39.647590 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-776x8" event={"ID":"1e48e07e-79a3-4925-a19a-b296810004a0","Type":"ContainerStarted","Data":"afc13f6ba4a11c9dc3943083736bd03cd675b5689a9768ae660eb7408886e173"} Apr 22 18:49:39.663682 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:39.663651 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/2c00d446-bbf6-4aab-a27c-ed0836ea1765-metrics-server-audit-profiles\") pod \"metrics-server-84d4bd4d44-fbqmd\" (UID: \"2c00d446-bbf6-4aab-a27c-ed0836ea1765\") " pod="openshift-monitoring/metrics-server-84d4bd4d44-fbqmd" Apr 22 18:49:39.663860 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:39.663697 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7zzbs\" (UniqueName: \"kubernetes.io/projected/2c00d446-bbf6-4aab-a27c-ed0836ea1765-kube-api-access-7zzbs\") pod \"metrics-server-84d4bd4d44-fbqmd\" (UID: \"2c00d446-bbf6-4aab-a27c-ed0836ea1765\") " pod="openshift-monitoring/metrics-server-84d4bd4d44-fbqmd" Apr 22 18:49:39.663860 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:39.663727 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/2c00d446-bbf6-4aab-a27c-ed0836ea1765-audit-log\") pod \"metrics-server-84d4bd4d44-fbqmd\" (UID: \"2c00d446-bbf6-4aab-a27c-ed0836ea1765\") " pod="openshift-monitoring/metrics-server-84d4bd4d44-fbqmd" Apr 22 18:49:39.663966 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:39.663929 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/2c00d446-bbf6-4aab-a27c-ed0836ea1765-secret-metrics-server-tls\") pod \"metrics-server-84d4bd4d44-fbqmd\" (UID: \"2c00d446-bbf6-4aab-a27c-ed0836ea1765\") " pod="openshift-monitoring/metrics-server-84d4bd4d44-fbqmd" Apr 22 18:49:39.664016 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:39.663973 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/2c00d446-bbf6-4aab-a27c-ed0836ea1765-secret-metrics-server-client-certs\") pod \"metrics-server-84d4bd4d44-fbqmd\" (UID: \"2c00d446-bbf6-4aab-a27c-ed0836ea1765\") " pod="openshift-monitoring/metrics-server-84d4bd4d44-fbqmd" Apr 22 18:49:39.664064 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:39.664034 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c00d446-bbf6-4aab-a27c-ed0836ea1765-client-ca-bundle\") pod \"metrics-server-84d4bd4d44-fbqmd\" (UID: \"2c00d446-bbf6-4aab-a27c-ed0836ea1765\") " pod="openshift-monitoring/metrics-server-84d4bd4d44-fbqmd" Apr 22 18:49:39.664135 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:39.664114 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c00d446-bbf6-4aab-a27c-ed0836ea1765-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-84d4bd4d44-fbqmd\" (UID: \"2c00d446-bbf6-4aab-a27c-ed0836ea1765\") " pod="openshift-monitoring/metrics-server-84d4bd4d44-fbqmd" Apr 22 18:49:39.664236 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:39.664216 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/2c00d446-bbf6-4aab-a27c-ed0836ea1765-audit-log\") pod \"metrics-server-84d4bd4d44-fbqmd\" (UID: \"2c00d446-bbf6-4aab-a27c-ed0836ea1765\") " pod="openshift-monitoring/metrics-server-84d4bd4d44-fbqmd" Apr 22 18:49:39.664973 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:39.664821 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/2c00d446-bbf6-4aab-a27c-ed0836ea1765-metrics-server-audit-profiles\") pod \"metrics-server-84d4bd4d44-fbqmd\" (UID: \"2c00d446-bbf6-4aab-a27c-ed0836ea1765\") " pod="openshift-monitoring/metrics-server-84d4bd4d44-fbqmd" Apr 22 18:49:39.665442 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:39.665417 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c00d446-bbf6-4aab-a27c-ed0836ea1765-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-84d4bd4d44-fbqmd\" (UID: \"2c00d446-bbf6-4aab-a27c-ed0836ea1765\") " pod="openshift-monitoring/metrics-server-84d4bd4d44-fbqmd" Apr 22 18:49:39.666861 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:39.666839 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/2c00d446-bbf6-4aab-a27c-ed0836ea1765-secret-metrics-server-client-certs\") pod \"metrics-server-84d4bd4d44-fbqmd\" (UID: \"2c00d446-bbf6-4aab-a27c-ed0836ea1765\") " pod="openshift-monitoring/metrics-server-84d4bd4d44-fbqmd" Apr 22 18:49:39.672166 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:39.672143 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/2c00d446-bbf6-4aab-a27c-ed0836ea1765-secret-metrics-server-tls\") pod \"metrics-server-84d4bd4d44-fbqmd\" (UID: \"2c00d446-bbf6-4aab-a27c-ed0836ea1765\") " pod="openshift-monitoring/metrics-server-84d4bd4d44-fbqmd" Apr 22 18:49:39.672292 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:39.672270 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c00d446-bbf6-4aab-a27c-ed0836ea1765-client-ca-bundle\") pod \"metrics-server-84d4bd4d44-fbqmd\" (UID: \"2c00d446-bbf6-4aab-a27c-ed0836ea1765\") " pod="openshift-monitoring/metrics-server-84d4bd4d44-fbqmd" Apr 22 18:49:39.677232 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:39.677211 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zzbs\" (UniqueName: \"kubernetes.io/projected/2c00d446-bbf6-4aab-a27c-ed0836ea1765-kube-api-access-7zzbs\") pod \"metrics-server-84d4bd4d44-fbqmd\" (UID: \"2c00d446-bbf6-4aab-a27c-ed0836ea1765\") " pod="openshift-monitoring/metrics-server-84d4bd4d44-fbqmd" Apr 22 18:49:39.775362 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:39.775326 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-84d4bd4d44-fbqmd" Apr 22 18:49:39.917013 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:39.916763 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-776x8" podStartSLOduration=3.481072385 podStartE2EDuration="4.916739181s" podCreationTimestamp="2026-04-22 18:49:35 +0000 UTC" firstStartedPulling="2026-04-22 18:49:36.232239277 +0000 UTC m=+162.681021598" lastFinishedPulling="2026-04-22 18:49:37.667906075 +0000 UTC m=+164.116688394" observedRunningTime="2026-04-22 18:49:39.664778092 +0000 UTC m=+166.113560428" watchObservedRunningTime="2026-04-22 18:49:39.916739181 +0000 UTC m=+166.365521512" Apr 22 18:49:39.917310 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:39.917288 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-84d4bd4d44-fbqmd"] Apr 22 18:49:39.922601 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:49:39.922567 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c00d446_bbf6_4aab_a27c_ed0836ea1765.slice/crio-01d42fc6b53669136d9493fed07f67d0708f64de2ddf00393fa6a6e9c6d3834f WatchSource:0}: Error finding container 01d42fc6b53669136d9493fed07f67d0708f64de2ddf00393fa6a6e9c6d3834f: Status 404 returned error can't find the container with id 01d42fc6b53669136d9493fed07f67d0708f64de2ddf00393fa6a6e9c6d3834f Apr 22 18:49:40.236557 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.236524 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-854f54fd88-552xv"] Apr 22 18:49:40.255809 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.255738 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-854f54fd88-552xv" Apr 22 18:49:40.257037 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.257001 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-854f54fd88-552xv"] Apr 22 18:49:40.258094 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.258072 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 22 18:49:40.258753 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.258685 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 22 18:49:40.258860 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.258822 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 22 18:49:40.258969 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.258946 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 22 18:49:40.259088 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.258998 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-p7d8c\"" Apr 22 18:49:40.259088 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.259017 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 22 18:49:40.265397 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.265165 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 22 18:49:40.370140 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.370098 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494-telemeter-client-tls\") pod \"telemeter-client-854f54fd88-552xv\" (UID: \"8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494\") " pod="openshift-monitoring/telemeter-client-854f54fd88-552xv" Apr 22 18:49:40.370343 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.370162 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494-telemeter-trusted-ca-bundle\") pod \"telemeter-client-854f54fd88-552xv\" (UID: \"8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494\") " pod="openshift-monitoring/telemeter-client-854f54fd88-552xv" Apr 22 18:49:40.370343 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.370199 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494-serving-certs-ca-bundle\") pod \"telemeter-client-854f54fd88-552xv\" (UID: \"8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494\") " pod="openshift-monitoring/telemeter-client-854f54fd88-552xv" Apr 22 18:49:40.370343 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.370222 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494-federate-client-tls\") pod 
\"telemeter-client-854f54fd88-552xv\" (UID: \"8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494\") " pod="openshift-monitoring/telemeter-client-854f54fd88-552xv" Apr 22 18:49:40.370343 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.370248 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-854f54fd88-552xv\" (UID: \"8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494\") " pod="openshift-monitoring/telemeter-client-854f54fd88-552xv" Apr 22 18:49:40.370343 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.370321 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqm47\" (UniqueName: \"kubernetes.io/projected/8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494-kube-api-access-xqm47\") pod \"telemeter-client-854f54fd88-552xv\" (UID: \"8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494\") " pod="openshift-monitoring/telemeter-client-854f54fd88-552xv" Apr 22 18:49:40.370606 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.370371 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494-secret-telemeter-client\") pod \"telemeter-client-854f54fd88-552xv\" (UID: \"8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494\") " pod="openshift-monitoring/telemeter-client-854f54fd88-552xv" Apr 22 18:49:40.370606 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.370406 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494-metrics-client-ca\") pod \"telemeter-client-854f54fd88-552xv\" (UID: \"8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494\") " pod="openshift-monitoring/telemeter-client-854f54fd88-552xv" Apr 22 18:49:40.471845 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.471805 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494-serving-certs-ca-bundle\") pod \"telemeter-client-854f54fd88-552xv\" (UID: \"8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494\") " pod="openshift-monitoring/telemeter-client-854f54fd88-552xv" Apr 22 18:49:40.472023 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.471850 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494-federate-client-tls\") pod \"telemeter-client-854f54fd88-552xv\" (UID: \"8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494\") " pod="openshift-monitoring/telemeter-client-854f54fd88-552xv" Apr 22 18:49:40.472023 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.471883 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-854f54fd88-552xv\" (UID: \"8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494\") " pod="openshift-monitoring/telemeter-client-854f54fd88-552xv" Apr 22 18:49:40.472023 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.471925 2579 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-api-access-xqm47\" (UniqueName: \"kubernetes.io/projected/8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494-kube-api-access-xqm47\") pod \"telemeter-client-854f54fd88-552xv\" (UID: \"8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494\") " pod="openshift-monitoring/telemeter-client-854f54fd88-552xv" Apr 22 18:49:40.472023 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.471971 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494-secret-telemeter-client\") pod \"telemeter-client-854f54fd88-552xv\" (UID: \"8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494\") " pod="openshift-monitoring/telemeter-client-854f54fd88-552xv" Apr 22 18:49:40.472023 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.472005 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494-metrics-client-ca\") pod \"telemeter-client-854f54fd88-552xv\" (UID: \"8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494\") " pod="openshift-monitoring/telemeter-client-854f54fd88-552xv" Apr 22 18:49:40.472261 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.472084 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494-telemeter-client-tls\") pod \"telemeter-client-854f54fd88-552xv\" (UID: \"8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494\") " pod="openshift-monitoring/telemeter-client-854f54fd88-552xv" Apr 22 18:49:40.472261 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.472126 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494-telemeter-trusted-ca-bundle\") pod \"telemeter-client-854f54fd88-552xv\" (UID: \"8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494\") " pod="openshift-monitoring/telemeter-client-854f54fd88-552xv" Apr 22 18:49:40.472709 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.472572 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494-serving-certs-ca-bundle\") pod \"telemeter-client-854f54fd88-552xv\" (UID: \"8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494\") " pod="openshift-monitoring/telemeter-client-854f54fd88-552xv" Apr 22 18:49:40.473125 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.473099 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494-metrics-client-ca\") pod \"telemeter-client-854f54fd88-552xv\" (UID: \"8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494\") " pod="openshift-monitoring/telemeter-client-854f54fd88-552xv" Apr 22 18:49:40.473229 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.473154 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494-telemeter-trusted-ca-bundle\") pod \"telemeter-client-854f54fd88-552xv\" (UID: \"8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494\") " pod="openshift-monitoring/telemeter-client-854f54fd88-552xv" Apr 22 18:49:40.476068 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.475026 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" 
(UniqueName: \"kubernetes.io/secret/8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494-federate-client-tls\") pod \"telemeter-client-854f54fd88-552xv\" (UID: \"8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494\") " pod="openshift-monitoring/telemeter-client-854f54fd88-552xv" Apr 22 18:49:40.476068 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.475473 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494-telemeter-client-tls\") pod \"telemeter-client-854f54fd88-552xv\" (UID: \"8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494\") " pod="openshift-monitoring/telemeter-client-854f54fd88-552xv" Apr 22 18:49:40.476068 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.475793 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494-secret-telemeter-client\") pod \"telemeter-client-854f54fd88-552xv\" (UID: \"8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494\") " pod="openshift-monitoring/telemeter-client-854f54fd88-552xv" Apr 22 18:49:40.477658 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.477567 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-854f54fd88-552xv\" (UID: \"8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494\") " pod="openshift-monitoring/telemeter-client-854f54fd88-552xv" Apr 22 18:49:40.479940 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.479898 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqm47\" (UniqueName: \"kubernetes.io/projected/8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494-kube-api-access-xqm47\") pod \"telemeter-client-854f54fd88-552xv\" (UID: \"8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494\") " pod="openshift-monitoring/telemeter-client-854f54fd88-552xv" Apr 22 18:49:40.569256 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.568991 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-854f54fd88-552xv" Apr 22 18:49:40.653190 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.653155 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-84d4bd4d44-fbqmd" event={"ID":"2c00d446-bbf6-4aab-a27c-ed0836ea1765","Type":"ContainerStarted","Data":"01d42fc6b53669136d9493fed07f67d0708f64de2ddf00393fa6a6e9c6d3834f"} Apr 22 18:49:40.724771 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:40.724708 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-854f54fd88-552xv"] Apr 22 18:49:40.727552 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:49:40.727520 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b0f7b21_b1b4_4d29_a1f1_54be7d7fa494.slice/crio-51f4b95eeb41bb0a95bc5092044fb79b9420209a981e43536b4383d028101ac2 WatchSource:0}: Error finding container 51f4b95eeb41bb0a95bc5092044fb79b9420209a981e43536b4383d028101ac2: Status 404 returned error can't find the container with id 51f4b95eeb41bb0a95bc5092044fb79b9420209a981e43536b4383d028101ac2 Apr 22 18:49:41.262786 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.262749 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:49:41.291102 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.291067 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:49:41.291315 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.291294 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.294311 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.294277 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 18:49:41.294578 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.294478 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 18:49:41.294916 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.294880 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 18:49:41.295097 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.295073 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 18:49:41.296013 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.295286 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 18:49:41.296013 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.295557 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 18:49:41.296013 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.295618 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-9jfotpgv8prin\"" Apr 22 18:49:41.296013 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.295785 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 18:49:41.296013 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.295796 2579 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-6zkff\"" Apr 22 18:49:41.296317 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.296056 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 18:49:41.296498 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.296460 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 18:49:41.296631 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.296459 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 18:49:41.301410 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.301365 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 18:49:41.311229 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.311206 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 18:49:41.381439 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.381312 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/499f8028-1941-4aaf-a1fa-a18e381ba1ea-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.381439 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.381362 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.381439 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.381399 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/499f8028-1941-4aaf-a1fa-a18e381ba1ea-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.381439 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.381428 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctrnj\" (UniqueName: \"kubernetes.io/projected/499f8028-1941-4aaf-a1fa-a18e381ba1ea-kube-api-access-ctrnj\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.381946 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.381457 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/499f8028-1941-4aaf-a1fa-a18e381ba1ea-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.381946 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.381535 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.381946 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.381570 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.381946 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.381639 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.381946 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.381666 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/499f8028-1941-4aaf-a1fa-a18e381ba1ea-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.381946 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.381712 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-config\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.381946 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.381734 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/499f8028-1941-4aaf-a1fa-a18e381ba1ea-config-out\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.381946 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.381782 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.381946 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.381808 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/499f8028-1941-4aaf-a1fa-a18e381ba1ea-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.381946 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.381884 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.381946 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.381932 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/499f8028-1941-4aaf-a1fa-a18e381ba1ea-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.382423 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.381960 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-web-config\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.382423 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.381997 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.382423 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.382053 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/499f8028-1941-4aaf-a1fa-a18e381ba1ea-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.482771 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.482731 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/499f8028-1941-4aaf-a1fa-a18e381ba1ea-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.482958 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.482786 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-web-config\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.482958 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.482813 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.482958 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.482860 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/499f8028-1941-4aaf-a1fa-a18e381ba1ea-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.482958 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.482921 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/499f8028-1941-4aaf-a1fa-a18e381ba1ea-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.482958 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.482950 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.483247 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.482987 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/499f8028-1941-4aaf-a1fa-a18e381ba1ea-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.483247 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.483017 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctrnj\" (UniqueName: \"kubernetes.io/projected/499f8028-1941-4aaf-a1fa-a18e381ba1ea-kube-api-access-ctrnj\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.483247 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.483063 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/499f8028-1941-4aaf-a1fa-a18e381ba1ea-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.483247 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.483103 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.483247 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.483128 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.483247 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.483197 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.483247 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.483223 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/499f8028-1941-4aaf-a1fa-a18e381ba1ea-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.483802 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.483263 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-config\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.483802 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.483288 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/499f8028-1941-4aaf-a1fa-a18e381ba1ea-config-out\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.483802 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.483339 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.483802 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.483367 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/499f8028-1941-4aaf-a1fa-a18e381ba1ea-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.483802 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.483392 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.484056 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.483820 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/499f8028-1941-4aaf-a1fa-a18e381ba1ea-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.484056 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.483880 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/499f8028-1941-4aaf-a1fa-a18e381ba1ea-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.484056 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.483917 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/499f8028-1941-4aaf-a1fa-a18e381ba1ea-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 
18:49:41.485410 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.484295 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/499f8028-1941-4aaf-a1fa-a18e381ba1ea-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.486569 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.485912 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/499f8028-1941-4aaf-a1fa-a18e381ba1ea-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.486569 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.486166 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/499f8028-1941-4aaf-a1fa-a18e381ba1ea-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.486989 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.486611 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-web-config\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.486989 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.486627 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.486989 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.486705 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.486989 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.486877 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.489544 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.488341 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/499f8028-1941-4aaf-a1fa-a18e381ba1ea-config-out\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.489544 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.489226 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" 
Apr 22 18:49:41.489544 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.489306 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.489749 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.489605 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-config\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.490336 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.490294 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.490771 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.490748 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.491959 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.491933 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctrnj\" (UniqueName: \"kubernetes.io/projected/499f8028-1941-4aaf-a1fa-a18e381ba1ea-kube-api-access-ctrnj\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.504278 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.504249 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/499f8028-1941-4aaf-a1fa-a18e381ba1ea-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.606006 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.605907 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:41.660251 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.660190 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-854f54fd88-552xv" event={"ID":"8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494","Type":"ContainerStarted","Data":"51f4b95eeb41bb0a95bc5092044fb79b9420209a981e43536b4383d028101ac2"} Apr 22 18:49:41.750339 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:41.750310 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:49:41.874532 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:49:41.874415 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod499f8028_1941_4aaf_a1fa_a18e381ba1ea.slice/crio-25db05e131eb9d29a9c81783fb2e432e18f83a8d8a7954001e81868aa8643bc7 WatchSource:0}: Error finding container 25db05e131eb9d29a9c81783fb2e432e18f83a8d8a7954001e81868aa8643bc7: Status 404 returned error can't find the container with id 25db05e131eb9d29a9c81783fb2e432e18f83a8d8a7954001e81868aa8643bc7 Apr 22 18:49:42.486869 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:42.486678 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" podUID="ceb18df5-258c-4b70-9799-2da231a29e8f" containerName="registry" containerID="cri-o://0284dbfa08ce0a8cfcfe69f2fb18e784b714ed18effbd0763786fb5f2d7088c2" gracePeriod=30 Apr 22 18:49:42.665515 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:42.665459 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-84d4bd4d44-fbqmd" event={"ID":"2c00d446-bbf6-4aab-a27c-ed0836ea1765","Type":"ContainerStarted","Data":"b3b9959845cab42325f12e5a1003f53533b387a84c3b7b246c909feb956fee29"} Apr 22 18:49:42.667258 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:42.667229 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"499f8028-1941-4aaf-a1fa-a18e381ba1ea","Type":"ContainerStarted","Data":"25db05e131eb9d29a9c81783fb2e432e18f83a8d8a7954001e81868aa8643bc7"} Apr 22 18:49:42.670437 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:42.670349 2579 generic.go:358] "Generic (PLEG): container finished" podID="ceb18df5-258c-4b70-9799-2da231a29e8f" containerID="0284dbfa08ce0a8cfcfe69f2fb18e784b714ed18effbd0763786fb5f2d7088c2" exitCode=0 Apr 22 18:49:42.670437 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:42.670406 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" event={"ID":"ceb18df5-258c-4b70-9799-2da231a29e8f","Type":"ContainerDied","Data":"0284dbfa08ce0a8cfcfe69f2fb18e784b714ed18effbd0763786fb5f2d7088c2"} Apr 22 18:49:42.778313 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:42.778285 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" Apr 22 18:49:42.796221 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:42.796132 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-84d4bd4d44-fbqmd" podStartSLOduration=1.8203601680000001 podStartE2EDuration="3.796110296s" podCreationTimestamp="2026-04-22 18:49:39 +0000 UTC" firstStartedPulling="2026-04-22 18:49:39.925036564 +0000 UTC m=+166.373818884" lastFinishedPulling="2026-04-22 18:49:41.900786698 +0000 UTC m=+168.349569012" observedRunningTime="2026-04-22 18:49:42.681274447 +0000 UTC m=+169.130056778" watchObservedRunningTime="2026-04-22 18:49:42.796110296 +0000 UTC m=+169.244892628" Apr 22 18:49:42.898902 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:42.898869 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ceb18df5-258c-4b70-9799-2da231a29e8f-ca-trust-extracted\") pod \"ceb18df5-258c-4b70-9799-2da231a29e8f\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " Apr 22 18:49:42.899115 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:42.898916 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ceb18df5-258c-4b70-9799-2da231a29e8f-trusted-ca\") pod \"ceb18df5-258c-4b70-9799-2da231a29e8f\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " Apr 22 18:49:42.899115 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:42.898964 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ceb18df5-258c-4b70-9799-2da231a29e8f-image-registry-private-configuration\") pod \"ceb18df5-258c-4b70-9799-2da231a29e8f\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " Apr 22 18:49:42.899115 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:42.898993 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ceb18df5-258c-4b70-9799-2da231a29e8f-registry-certificates\") pod \"ceb18df5-258c-4b70-9799-2da231a29e8f\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " Apr 22 18:49:42.899115 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:42.899039 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvhd6\" (UniqueName: \"kubernetes.io/projected/ceb18df5-258c-4b70-9799-2da231a29e8f-kube-api-access-kvhd6\") pod \"ceb18df5-258c-4b70-9799-2da231a29e8f\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " Apr 22 18:49:42.899115 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:42.899078 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ceb18df5-258c-4b70-9799-2da231a29e8f-registry-tls\") pod \"ceb18df5-258c-4b70-9799-2da231a29e8f\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " Apr 22 18:49:42.899387 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:42.899117 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ceb18df5-258c-4b70-9799-2da231a29e8f-bound-sa-token\") pod \"ceb18df5-258c-4b70-9799-2da231a29e8f\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " Apr 22 18:49:42.899387 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:42.899152 2579 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ceb18df5-258c-4b70-9799-2da231a29e8f-installation-pull-secrets\") pod \"ceb18df5-258c-4b70-9799-2da231a29e8f\" (UID: \"ceb18df5-258c-4b70-9799-2da231a29e8f\") " Apr 22 18:49:42.900052 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:42.900020 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ceb18df5-258c-4b70-9799-2da231a29e8f-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "ceb18df5-258c-4b70-9799-2da231a29e8f" (UID: "ceb18df5-258c-4b70-9799-2da231a29e8f"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:49:42.900380 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:42.900344 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ceb18df5-258c-4b70-9799-2da231a29e8f-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "ceb18df5-258c-4b70-9799-2da231a29e8f" (UID: "ceb18df5-258c-4b70-9799-2da231a29e8f"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:49:42.905098 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:42.905064 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceb18df5-258c-4b70-9799-2da231a29e8f-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "ceb18df5-258c-4b70-9799-2da231a29e8f" (UID: "ceb18df5-258c-4b70-9799-2da231a29e8f"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:49:42.905325 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:42.905210 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceb18df5-258c-4b70-9799-2da231a29e8f-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "ceb18df5-258c-4b70-9799-2da231a29e8f" (UID: "ceb18df5-258c-4b70-9799-2da231a29e8f"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:49:42.906382 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:42.906293 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceb18df5-258c-4b70-9799-2da231a29e8f-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "ceb18df5-258c-4b70-9799-2da231a29e8f" (UID: "ceb18df5-258c-4b70-9799-2da231a29e8f"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:49:42.906590 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:42.906388 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceb18df5-258c-4b70-9799-2da231a29e8f-kube-api-access-kvhd6" (OuterVolumeSpecName: "kube-api-access-kvhd6") pod "ceb18df5-258c-4b70-9799-2da231a29e8f" (UID: "ceb18df5-258c-4b70-9799-2da231a29e8f"). InnerVolumeSpecName "kube-api-access-kvhd6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:49:42.907020 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:42.906975 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceb18df5-258c-4b70-9799-2da231a29e8f-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "ceb18df5-258c-4b70-9799-2da231a29e8f" (UID: "ceb18df5-258c-4b70-9799-2da231a29e8f"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:49:42.914393 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:42.914366 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ceb18df5-258c-4b70-9799-2da231a29e8f-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "ceb18df5-258c-4b70-9799-2da231a29e8f" (UID: "ceb18df5-258c-4b70-9799-2da231a29e8f"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:49:43.000285 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:43.000248 2579 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ceb18df5-258c-4b70-9799-2da231a29e8f-image-registry-private-configuration\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 18:49:43.000285 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:43.000285 2579 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ceb18df5-258c-4b70-9799-2da231a29e8f-registry-certificates\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 18:49:43.000630 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:43.000301 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kvhd6\" (UniqueName: \"kubernetes.io/projected/ceb18df5-258c-4b70-9799-2da231a29e8f-kube-api-access-kvhd6\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 18:49:43.000630 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:43.000315 2579 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ceb18df5-258c-4b70-9799-2da231a29e8f-registry-tls\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 18:49:43.000630 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:43.000329 2579 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ceb18df5-258c-4b70-9799-2da231a29e8f-bound-sa-token\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 18:49:43.000630 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:43.000339 2579 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ceb18df5-258c-4b70-9799-2da231a29e8f-installation-pull-secrets\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 18:49:43.000630 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:43.000348 2579 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ceb18df5-258c-4b70-9799-2da231a29e8f-ca-trust-extracted\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 18:49:43.000630 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:43.000358 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ceb18df5-258c-4b70-9799-2da231a29e8f-trusted-ca\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 18:49:43.160089 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:43.160009 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2qtxz" Apr 22 18:49:43.162540 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:43.162499 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rxjsb\"" Apr 22 18:49:43.171338 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:43.171299 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2qtxz" Apr 22 18:49:43.675412 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:43.675372 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" event={"ID":"ceb18df5-258c-4b70-9799-2da231a29e8f","Type":"ContainerDied","Data":"f4a8323910d04baef32f444e7125df5b552238cc61e3a7aa8ef619e05a2b486b"} Apr 22 18:49:43.675412 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:43.675425 2579 scope.go:117] "RemoveContainer" containerID="0284dbfa08ce0a8cfcfe69f2fb18e784b714ed18effbd0763786fb5f2d7088c2" Apr 22 18:49:43.675953 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:43.675449 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7cf88557b7-mcsdx" Apr 22 18:49:43.698054 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:43.698022 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7cf88557b7-mcsdx"] Apr 22 18:49:43.700099 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:43.700075 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7cf88557b7-mcsdx"] Apr 22 18:49:44.169224 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:44.169190 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ceb18df5-258c-4b70-9799-2da231a29e8f" path="/var/lib/kubelet/pods/ceb18df5-258c-4b70-9799-2da231a29e8f/volumes" Apr 22 18:49:44.273400 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:44.273372 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2qtxz"] Apr 22 18:49:44.275955 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:49:44.275923 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d74ab93_b985_4a88_882c_10c6a352ac61.slice/crio-90f21f99e3ff92e783ae75ad89970e2fbe2b68c57c8f77fcb3dfc0bdfb24fd9c WatchSource:0}: Error finding container 90f21f99e3ff92e783ae75ad89970e2fbe2b68c57c8f77fcb3dfc0bdfb24fd9c: Status 404 returned error can't find the container with id 90f21f99e3ff92e783ae75ad89970e2fbe2b68c57c8f77fcb3dfc0bdfb24fd9c Apr 22 18:49:44.680505 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:44.680450 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-854f54fd88-552xv" event={"ID":"8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494","Type":"ContainerStarted","Data":"9ffde869d68f85627b520ffcdb09f825d7ccf69844e65ab8be1348bc6c226ce1"} Apr 22 18:49:44.680505 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:44.680507 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-854f54fd88-552xv" event={"ID":"8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494","Type":"ContainerStarted","Data":"430d8aefd1527eb44a955ca2751da2bb6ff8a9a369f2be21b9fea765cce7877b"} Apr 22 18:49:44.681010 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:44.680527 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/telemeter-client-854f54fd88-552xv" event={"ID":"8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494","Type":"ContainerStarted","Data":"5383d2d203a881574f7e33e06356265ba592ec87093da9b004c7fa7ea0f25365"} Apr 22 18:49:44.681782 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:44.681760 2579 generic.go:358] "Generic (PLEG): container finished" podID="499f8028-1941-4aaf-a1fa-a18e381ba1ea" containerID="6b3e273f7d4eb0e5daf4d9ddd952d3a0eced40ce29aae842c0318f3518127c07" exitCode=0 Apr 22 18:49:44.681896 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:44.681811 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"499f8028-1941-4aaf-a1fa-a18e381ba1ea","Type":"ContainerDied","Data":"6b3e273f7d4eb0e5daf4d9ddd952d3a0eced40ce29aae842c0318f3518127c07"} Apr 22 18:49:44.682890 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:44.682806 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2qtxz" event={"ID":"3d74ab93-b985-4a88-882c-10c6a352ac61","Type":"ContainerStarted","Data":"90f21f99e3ff92e783ae75ad89970e2fbe2b68c57c8f77fcb3dfc0bdfb24fd9c"} Apr 22 18:49:44.699338 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:44.699250 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-854f54fd88-552xv" podStartSLOduration=1.278043875 podStartE2EDuration="4.699236816s" podCreationTimestamp="2026-04-22 18:49:40 +0000 UTC" firstStartedPulling="2026-04-22 18:49:40.730169774 +0000 UTC m=+167.178952083" lastFinishedPulling="2026-04-22 18:49:44.151362716 +0000 UTC m=+170.600145024" observedRunningTime="2026-04-22 18:49:44.697245128 +0000 UTC m=+171.146027460" watchObservedRunningTime="2026-04-22 18:49:44.699236816 +0000 UTC m=+171.148019145" Apr 22 18:49:46.159536 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:46.159494 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z69xm" Apr 22 18:49:46.694309 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:46.694215 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2qtxz" event={"ID":"3d74ab93-b985-4a88-882c-10c6a352ac61","Type":"ContainerStarted","Data":"e5c2e0becdaedee53c85172446cb897d1571fbebdeee0017ea34a1d72b509862"} Apr 22 18:49:46.708637 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:46.708582 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2qtxz" podStartSLOduration=138.60732004 podStartE2EDuration="2m20.708566296s" podCreationTimestamp="2026-04-22 18:47:26 +0000 UTC" firstStartedPulling="2026-04-22 18:49:44.27784938 +0000 UTC m=+170.726631688" lastFinishedPulling="2026-04-22 18:49:46.379095636 +0000 UTC m=+172.827877944" observedRunningTime="2026-04-22 18:49:46.707597105 +0000 UTC m=+173.156379437" watchObservedRunningTime="2026-04-22 18:49:46.708566296 +0000 UTC m=+173.157348626" Apr 22 18:49:47.643959 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:47.643925 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-qd4bd" Apr 22 18:49:48.702066 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:48.702030 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"499f8028-1941-4aaf-a1fa-a18e381ba1ea","Type":"ContainerStarted","Data":"f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341"} Apr 22 18:49:48.702066 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:48.702072 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"499f8028-1941-4aaf-a1fa-a18e381ba1ea","Type":"ContainerStarted","Data":"1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44"} Apr 22 18:49:50.710742 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:50.710655 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"499f8028-1941-4aaf-a1fa-a18e381ba1ea","Type":"ContainerStarted","Data":"4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894"} Apr 22 18:49:50.710742 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:50.710691 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"499f8028-1941-4aaf-a1fa-a18e381ba1ea","Type":"ContainerStarted","Data":"623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b"} Apr 22 18:49:50.710742 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:50.710703 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"499f8028-1941-4aaf-a1fa-a18e381ba1ea","Type":"ContainerStarted","Data":"a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3"} Apr 22 18:49:50.710742 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:50.710711 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"499f8028-1941-4aaf-a1fa-a18e381ba1ea","Type":"ContainerStarted","Data":"18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c"} Apr 22 18:49:50.734931 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:50.734870 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.5028312449999999 podStartE2EDuration="9.734854859s" 
podCreationTimestamp="2026-04-22 18:49:41 +0000 UTC" firstStartedPulling="2026-04-22 18:49:41.895958197 +0000 UTC m=+168.344740512" lastFinishedPulling="2026-04-22 18:49:50.127981808 +0000 UTC m=+176.576764126" observedRunningTime="2026-04-22 18:49:50.733315718 +0000 UTC m=+177.182098049" watchObservedRunningTime="2026-04-22 18:49:50.734854859 +0000 UTC m=+177.183637189" Apr 22 18:49:51.606053 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:51.606013 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:59.776335 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:59.776293 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-84d4bd4d44-fbqmd" Apr 22 18:49:59.776797 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:49:59.776364 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-84d4bd4d44-fbqmd" Apr 22 18:50:19.783358 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:19.783319 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-84d4bd4d44-fbqmd" Apr 22 18:50:19.792754 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:19.792725 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-84d4bd4d44-fbqmd" Apr 22 18:50:25.808800 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:25.808764 2579 generic.go:358] "Generic (PLEG): container finished" podID="a74fd54d-3421-40fe-967e-0f3839d3664e" containerID="a155f3913cbc9dbd64c981675caf479b286fb98725c71112e23e968b4308d73b" exitCode=0 Apr 22 18:50:25.809316 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:25.808846 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x6vrp" event={"ID":"a74fd54d-3421-40fe-967e-0f3839d3664e","Type":"ContainerDied","Data":"a155f3913cbc9dbd64c981675caf479b286fb98725c71112e23e968b4308d73b"} Apr 22 18:50:25.809316 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:25.809263 2579 scope.go:117] "RemoveContainer" containerID="a155f3913cbc9dbd64c981675caf479b286fb98725c71112e23e968b4308d73b" Apr 22 18:50:26.813410 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:26.813377 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x6vrp" event={"ID":"a74fd54d-3421-40fe-967e-0f3839d3664e","Type":"ContainerStarted","Data":"0ceb6b33cb3a3464fcfa9d6bed2f374b63451d1812c12ab30545b5e35bc24cba"} Apr 22 18:50:41.606396 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:41.606351 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:41.622004 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:41.621954 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:41.869578 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:41.869503 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:59.598975 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.598938 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:50:59.599399 ip-10-0-136-53 kubenswrapper[2579]: I0422 
18:50:59.599343 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="499f8028-1941-4aaf-a1fa-a18e381ba1ea" containerName="prometheus" containerID="cri-o://1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44" gracePeriod=600 Apr 22 18:50:59.599464 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.599372 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="499f8028-1941-4aaf-a1fa-a18e381ba1ea" containerName="kube-rbac-proxy" containerID="cri-o://623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b" gracePeriod=600 Apr 22 18:50:59.599464 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.599393 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="499f8028-1941-4aaf-a1fa-a18e381ba1ea" containerName="thanos-sidecar" containerID="cri-o://18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c" gracePeriod=600 Apr 22 18:50:59.599464 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.599419 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="499f8028-1941-4aaf-a1fa-a18e381ba1ea" containerName="kube-rbac-proxy-web" containerID="cri-o://a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3" gracePeriod=600 Apr 22 18:50:59.599625 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.599454 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="499f8028-1941-4aaf-a1fa-a18e381ba1ea" containerName="config-reloader" containerID="cri-o://f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341" gracePeriod=600 Apr 22 18:50:59.599625 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.599461 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="499f8028-1941-4aaf-a1fa-a18e381ba1ea" containerName="kube-rbac-proxy-thanos" containerID="cri-o://4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894" gracePeriod=600 Apr 22 18:50:59.840123 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.840099 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:59.904469 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.904388 2579 generic.go:358] "Generic (PLEG): container finished" podID="499f8028-1941-4aaf-a1fa-a18e381ba1ea" containerID="4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894" exitCode=0 Apr 22 18:50:59.904469 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.904415 2579 generic.go:358] "Generic (PLEG): container finished" podID="499f8028-1941-4aaf-a1fa-a18e381ba1ea" containerID="623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b" exitCode=0 Apr 22 18:50:59.904469 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.904421 2579 generic.go:358] "Generic (PLEG): container finished" podID="499f8028-1941-4aaf-a1fa-a18e381ba1ea" containerID="a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3" exitCode=0 Apr 22 18:50:59.904469 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.904427 2579 generic.go:358] "Generic (PLEG): container finished" podID="499f8028-1941-4aaf-a1fa-a18e381ba1ea" containerID="18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c" exitCode=0 Apr 22 18:50:59.904469 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.904432 2579 generic.go:358] "Generic (PLEG): container finished" podID="499f8028-1941-4aaf-a1fa-a18e381ba1ea" containerID="f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341" exitCode=0 Apr 22 18:50:59.904469 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.904437 2579 generic.go:358] "Generic (PLEG): container finished" podID="499f8028-1941-4aaf-a1fa-a18e381ba1ea" containerID="1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44" exitCode=0 Apr 22 18:50:59.904864 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.904478 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"499f8028-1941-4aaf-a1fa-a18e381ba1ea","Type":"ContainerDied","Data":"4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894"} Apr 22 18:50:59.904864 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.904536 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:59.904864 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.904553 2579 scope.go:117] "RemoveContainer" containerID="4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894" Apr 22 18:50:59.904864 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.904539 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"499f8028-1941-4aaf-a1fa-a18e381ba1ea","Type":"ContainerDied","Data":"623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b"} Apr 22 18:50:59.904864 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.904672 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"499f8028-1941-4aaf-a1fa-a18e381ba1ea","Type":"ContainerDied","Data":"a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3"} Apr 22 18:50:59.904864 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.904687 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"499f8028-1941-4aaf-a1fa-a18e381ba1ea","Type":"ContainerDied","Data":"18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c"} Apr 22 18:50:59.904864 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.904700 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"499f8028-1941-4aaf-a1fa-a18e381ba1ea","Type":"ContainerDied","Data":"f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341"} Apr 22 18:50:59.904864 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.904716 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"499f8028-1941-4aaf-a1fa-a18e381ba1ea","Type":"ContainerDied","Data":"1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44"} Apr 22 18:50:59.904864 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.904732 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"499f8028-1941-4aaf-a1fa-a18e381ba1ea","Type":"ContainerDied","Data":"25db05e131eb9d29a9c81783fb2e432e18f83a8d8a7954001e81868aa8643bc7"} Apr 22 18:50:59.912119 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.912095 2579 scope.go:117] "RemoveContainer" containerID="623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b" Apr 22 18:50:59.918737 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.918712 2579 scope.go:117] "RemoveContainer" containerID="a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3" Apr 22 18:50:59.924986 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.924968 2579 scope.go:117] "RemoveContainer" containerID="18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c" Apr 22 18:50:59.931006 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.930988 2579 scope.go:117] "RemoveContainer" containerID="f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341" Apr 22 18:50:59.937191 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.937175 2579 scope.go:117] "RemoveContainer" containerID="1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44" Apr 22 18:50:59.943884 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.943865 2579 scope.go:117] "RemoveContainer" containerID="6b3e273f7d4eb0e5daf4d9ddd952d3a0eced40ce29aae842c0318f3518127c07" Apr 22 18:50:59.950047 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.950029 2579 scope.go:117] "RemoveContainer" 
containerID="4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894" Apr 22 18:50:59.950325 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:50:59.950305 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894\": container with ID starting with 4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894 not found: ID does not exist" containerID="4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894" Apr 22 18:50:59.950395 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.950341 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894"} err="failed to get container status \"4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894\": rpc error: code = NotFound desc = could not find container \"4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894\": container with ID starting with 4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894 not found: ID does not exist" Apr 22 18:50:59.950395 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.950387 2579 scope.go:117] "RemoveContainer" containerID="623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b" Apr 22 18:50:59.950651 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:50:59.950633 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b\": container with ID starting with 623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b not found: ID does not exist" containerID="623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b" Apr 22 18:50:59.950709 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.950659 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b"} err="failed to get container status \"623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b\": rpc error: code = NotFound desc = could not find container \"623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b\": container with ID starting with 623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b not found: ID does not exist" Apr 22 18:50:59.950709 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.950677 2579 scope.go:117] "RemoveContainer" containerID="a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3" Apr 22 18:50:59.950910 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:50:59.950893 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3\": container with ID starting with a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3 not found: ID does not exist" containerID="a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3" Apr 22 18:50:59.950973 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.950918 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3"} err="failed to get container status \"a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3\": rpc error: code = NotFound desc = could not find 
container \"a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3\": container with ID starting with a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3 not found: ID does not exist" Apr 22 18:50:59.950973 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.950941 2579 scope.go:117] "RemoveContainer" containerID="18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c" Apr 22 18:50:59.951174 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:50:59.951151 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c\": container with ID starting with 18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c not found: ID does not exist" containerID="18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c" Apr 22 18:50:59.951257 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.951178 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c"} err="failed to get container status \"18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c\": rpc error: code = NotFound desc = could not find container \"18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c\": container with ID starting with 18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c not found: ID does not exist" Apr 22 18:50:59.951257 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.951192 2579 scope.go:117] "RemoveContainer" containerID="f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341" Apr 22 18:50:59.951416 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:50:59.951395 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341\": container with ID starting with f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341 not found: ID does not exist" containerID="f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341" Apr 22 18:50:59.951455 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.951424 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341"} err="failed to get container status \"f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341\": rpc error: code = NotFound desc = could not find container \"f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341\": container with ID starting with f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341 not found: ID does not exist" Apr 22 18:50:59.951455 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.951440 2579 scope.go:117] "RemoveContainer" containerID="1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44" Apr 22 18:50:59.951681 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:50:59.951658 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44\": container with ID starting with 1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44 not found: ID does not exist" containerID="1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44" Apr 22 18:50:59.951785 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.951682 2579 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44"} err="failed to get container status \"1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44\": rpc error: code = NotFound desc = could not find container \"1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44\": container with ID starting with 1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44 not found: ID does not exist" Apr 22 18:50:59.951785 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.951696 2579 scope.go:117] "RemoveContainer" containerID="6b3e273f7d4eb0e5daf4d9ddd952d3a0eced40ce29aae842c0318f3518127c07" Apr 22 18:50:59.951914 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:50:59.951897 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b3e273f7d4eb0e5daf4d9ddd952d3a0eced40ce29aae842c0318f3518127c07\": container with ID starting with 6b3e273f7d4eb0e5daf4d9ddd952d3a0eced40ce29aae842c0318f3518127c07 not found: ID does not exist" containerID="6b3e273f7d4eb0e5daf4d9ddd952d3a0eced40ce29aae842c0318f3518127c07" Apr 22 18:50:59.951953 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.951919 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b3e273f7d4eb0e5daf4d9ddd952d3a0eced40ce29aae842c0318f3518127c07"} err="failed to get container status \"6b3e273f7d4eb0e5daf4d9ddd952d3a0eced40ce29aae842c0318f3518127c07\": rpc error: code = NotFound desc = could not find container \"6b3e273f7d4eb0e5daf4d9ddd952d3a0eced40ce29aae842c0318f3518127c07\": container with ID starting with 6b3e273f7d4eb0e5daf4d9ddd952d3a0eced40ce29aae842c0318f3518127c07 not found: ID does not exist" Apr 22 18:50:59.951953 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.951933 2579 scope.go:117] "RemoveContainer" containerID="4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894" Apr 22 18:50:59.952148 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.952130 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894"} err="failed to get container status \"4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894\": rpc error: code = NotFound desc = could not find container \"4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894\": container with ID starting with 4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894 not found: ID does not exist" Apr 22 18:50:59.952215 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.952151 2579 scope.go:117] "RemoveContainer" containerID="623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b" Apr 22 18:50:59.952375 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.952357 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b"} err="failed to get container status \"623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b\": rpc error: code = NotFound desc = could not find container \"623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b\": container with ID starting with 623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b not found: ID does not exist" Apr 22 18:50:59.952440 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.952376 2579 scope.go:117] "RemoveContainer" 
containerID="a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3" Apr 22 18:50:59.952596 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.952576 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3"} err="failed to get container status \"a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3\": rpc error: code = NotFound desc = could not find container \"a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3\": container with ID starting with a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3 not found: ID does not exist" Apr 22 18:50:59.952648 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.952597 2579 scope.go:117] "RemoveContainer" containerID="18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c" Apr 22 18:50:59.952776 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.952758 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c"} err="failed to get container status \"18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c\": rpc error: code = NotFound desc = could not find container \"18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c\": container with ID starting with 18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c not found: ID does not exist" Apr 22 18:50:59.952840 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.952778 2579 scope.go:117] "RemoveContainer" containerID="f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341" Apr 22 18:50:59.952978 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.952955 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341"} err="failed to get container status \"f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341\": rpc error: code = NotFound desc = could not find container \"f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341\": container with ID starting with f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341 not found: ID does not exist" Apr 22 18:50:59.953024 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.952981 2579 scope.go:117] "RemoveContainer" containerID="1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44" Apr 22 18:50:59.953166 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.953149 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44"} err="failed to get container status \"1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44\": rpc error: code = NotFound desc = could not find container \"1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44\": container with ID starting with 1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44 not found: ID does not exist" Apr 22 18:50:59.953212 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.953167 2579 scope.go:117] "RemoveContainer" containerID="6b3e273f7d4eb0e5daf4d9ddd952d3a0eced40ce29aae842c0318f3518127c07" Apr 22 18:50:59.953411 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.953393 2579 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6b3e273f7d4eb0e5daf4d9ddd952d3a0eced40ce29aae842c0318f3518127c07"} err="failed to get container status \"6b3e273f7d4eb0e5daf4d9ddd952d3a0eced40ce29aae842c0318f3518127c07\": rpc error: code = NotFound desc = could not find container \"6b3e273f7d4eb0e5daf4d9ddd952d3a0eced40ce29aae842c0318f3518127c07\": container with ID starting with 6b3e273f7d4eb0e5daf4d9ddd952d3a0eced40ce29aae842c0318f3518127c07 not found: ID does not exist" Apr 22 18:50:59.953480 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.953412 2579 scope.go:117] "RemoveContainer" containerID="4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894" Apr 22 18:50:59.953626 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.953610 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894"} err="failed to get container status \"4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894\": rpc error: code = NotFound desc = could not find container \"4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894\": container with ID starting with 4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894 not found: ID does not exist" Apr 22 18:50:59.953675 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.953628 2579 scope.go:117] "RemoveContainer" containerID="623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b" Apr 22 18:50:59.953855 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.953827 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b"} err="failed to get container status \"623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b\": rpc error: code = NotFound desc = could not find container \"623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b\": container with ID starting with 623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b not found: ID does not exist" Apr 22 18:50:59.953855 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.953847 2579 scope.go:117] "RemoveContainer" containerID="a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3" Apr 22 18:50:59.954036 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.954015 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3"} err="failed to get container status \"a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3\": rpc error: code = NotFound desc = could not find container \"a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3\": container with ID starting with a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3 not found: ID does not exist" Apr 22 18:50:59.954036 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.954032 2579 scope.go:117] "RemoveContainer" containerID="18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c" Apr 22 18:50:59.954241 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.954226 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c"} err="failed to get container status \"18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c\": rpc error: code = NotFound desc = could not find container 
\"18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c\": container with ID starting with 18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c not found: ID does not exist" Apr 22 18:50:59.954293 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.954240 2579 scope.go:117] "RemoveContainer" containerID="f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341" Apr 22 18:50:59.954422 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.954407 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341"} err="failed to get container status \"f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341\": rpc error: code = NotFound desc = could not find container \"f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341\": container with ID starting with f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341 not found: ID does not exist" Apr 22 18:50:59.954467 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.954422 2579 scope.go:117] "RemoveContainer" containerID="1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44" Apr 22 18:50:59.954634 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.954614 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44"} err="failed to get container status \"1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44\": rpc error: code = NotFound desc = could not find container \"1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44\": container with ID starting with 1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44 not found: ID does not exist" Apr 22 18:50:59.954707 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.954635 2579 scope.go:117] "RemoveContainer" containerID="6b3e273f7d4eb0e5daf4d9ddd952d3a0eced40ce29aae842c0318f3518127c07" Apr 22 18:50:59.954837 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.954819 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b3e273f7d4eb0e5daf4d9ddd952d3a0eced40ce29aae842c0318f3518127c07"} err="failed to get container status \"6b3e273f7d4eb0e5daf4d9ddd952d3a0eced40ce29aae842c0318f3518127c07\": rpc error: code = NotFound desc = could not find container \"6b3e273f7d4eb0e5daf4d9ddd952d3a0eced40ce29aae842c0318f3518127c07\": container with ID starting with 6b3e273f7d4eb0e5daf4d9ddd952d3a0eced40ce29aae842c0318f3518127c07 not found: ID does not exist" Apr 22 18:50:59.954893 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.954837 2579 scope.go:117] "RemoveContainer" containerID="4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894" Apr 22 18:50:59.955036 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.955018 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894"} err="failed to get container status \"4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894\": rpc error: code = NotFound desc = could not find container \"4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894\": container with ID starting with 4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894 not found: ID does not exist" Apr 22 18:50:59.955076 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.955036 2579 scope.go:117] "RemoveContainer" 
containerID="623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b" Apr 22 18:50:59.955215 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.955196 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b"} err="failed to get container status \"623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b\": rpc error: code = NotFound desc = could not find container \"623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b\": container with ID starting with 623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b not found: ID does not exist" Apr 22 18:50:59.955264 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.955218 2579 scope.go:117] "RemoveContainer" containerID="a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3" Apr 22 18:50:59.955389 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.955375 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3"} err="failed to get container status \"a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3\": rpc error: code = NotFound desc = could not find container \"a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3\": container with ID starting with a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3 not found: ID does not exist" Apr 22 18:50:59.955447 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.955389 2579 scope.go:117] "RemoveContainer" containerID="18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c" Apr 22 18:50:59.955609 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.955593 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c"} err="failed to get container status \"18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c\": rpc error: code = NotFound desc = could not find container \"18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c\": container with ID starting with 18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c not found: ID does not exist" Apr 22 18:50:59.955657 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.955610 2579 scope.go:117] "RemoveContainer" containerID="f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341" Apr 22 18:50:59.955796 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.955782 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341"} err="failed to get container status \"f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341\": rpc error: code = NotFound desc = could not find container \"f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341\": container with ID starting with f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341 not found: ID does not exist" Apr 22 18:50:59.955844 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.955796 2579 scope.go:117] "RemoveContainer" containerID="1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44" Apr 22 18:50:59.955974 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.955960 2579 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44"} err="failed to get container status \"1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44\": rpc error: code = NotFound desc = could not find container \"1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44\": container with ID starting with 1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44 not found: ID does not exist" Apr 22 18:50:59.956025 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.955974 2579 scope.go:117] "RemoveContainer" containerID="6b3e273f7d4eb0e5daf4d9ddd952d3a0eced40ce29aae842c0318f3518127c07" Apr 22 18:50:59.956183 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.956160 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b3e273f7d4eb0e5daf4d9ddd952d3a0eced40ce29aae842c0318f3518127c07"} err="failed to get container status \"6b3e273f7d4eb0e5daf4d9ddd952d3a0eced40ce29aae842c0318f3518127c07\": rpc error: code = NotFound desc = could not find container \"6b3e273f7d4eb0e5daf4d9ddd952d3a0eced40ce29aae842c0318f3518127c07\": container with ID starting with 6b3e273f7d4eb0e5daf4d9ddd952d3a0eced40ce29aae842c0318f3518127c07 not found: ID does not exist" Apr 22 18:50:59.956183 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.956180 2579 scope.go:117] "RemoveContainer" containerID="4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894" Apr 22 18:50:59.956363 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.956346 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894"} err="failed to get container status \"4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894\": rpc error: code = NotFound desc = could not find container \"4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894\": container with ID starting with 4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894 not found: ID does not exist" Apr 22 18:50:59.956403 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.956364 2579 scope.go:117] "RemoveContainer" containerID="623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b" Apr 22 18:50:59.956612 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.956592 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b"} err="failed to get container status \"623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b\": rpc error: code = NotFound desc = could not find container \"623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b\": container with ID starting with 623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b not found: ID does not exist" Apr 22 18:50:59.956687 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.956614 2579 scope.go:117] "RemoveContainer" containerID="a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3" Apr 22 18:50:59.956844 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.956828 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3"} err="failed to get container status \"a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3\": rpc error: code = NotFound desc = could not find container 
\"a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3\": container with ID starting with a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3 not found: ID does not exist" Apr 22 18:50:59.956899 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.956845 2579 scope.go:117] "RemoveContainer" containerID="18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c" Apr 22 18:50:59.957066 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.957046 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c"} err="failed to get container status \"18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c\": rpc error: code = NotFound desc = could not find container \"18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c\": container with ID starting with 18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c not found: ID does not exist" Apr 22 18:50:59.957129 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.957066 2579 scope.go:117] "RemoveContainer" containerID="f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341" Apr 22 18:50:59.957263 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.957249 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341"} err="failed to get container status \"f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341\": rpc error: code = NotFound desc = could not find container \"f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341\": container with ID starting with f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341 not found: ID does not exist" Apr 22 18:50:59.957317 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.957264 2579 scope.go:117] "RemoveContainer" containerID="1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44" Apr 22 18:50:59.957480 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.957454 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44"} err="failed to get container status \"1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44\": rpc error: code = NotFound desc = could not find container \"1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44\": container with ID starting with 1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44 not found: ID does not exist" Apr 22 18:50:59.957593 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.957578 2579 scope.go:117] "RemoveContainer" containerID="6b3e273f7d4eb0e5daf4d9ddd952d3a0eced40ce29aae842c0318f3518127c07" Apr 22 18:50:59.957777 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.957761 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b3e273f7d4eb0e5daf4d9ddd952d3a0eced40ce29aae842c0318f3518127c07"} err="failed to get container status \"6b3e273f7d4eb0e5daf4d9ddd952d3a0eced40ce29aae842c0318f3518127c07\": rpc error: code = NotFound desc = could not find container \"6b3e273f7d4eb0e5daf4d9ddd952d3a0eced40ce29aae842c0318f3518127c07\": container with ID starting with 6b3e273f7d4eb0e5daf4d9ddd952d3a0eced40ce29aae842c0318f3518127c07 not found: ID does not exist" Apr 22 18:50:59.957777 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.957776 2579 scope.go:117] "RemoveContainer" 
containerID="4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894" Apr 22 18:50:59.957994 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.957977 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894"} err="failed to get container status \"4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894\": rpc error: code = NotFound desc = could not find container \"4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894\": container with ID starting with 4ea7fadd5e6b6a17523d6ffbf998580f7b46eeac8bd1dbc350a42e9468011894 not found: ID does not exist" Apr 22 18:50:59.958036 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.957995 2579 scope.go:117] "RemoveContainer" containerID="623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b" Apr 22 18:50:59.958210 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.958194 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b"} err="failed to get container status \"623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b\": rpc error: code = NotFound desc = could not find container \"623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b\": container with ID starting with 623bdf1a645dbf11bc8f3d453435b31a82d4d9e9cf7c030d3858e5105772c58b not found: ID does not exist" Apr 22 18:50:59.958255 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.958210 2579 scope.go:117] "RemoveContainer" containerID="a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3" Apr 22 18:50:59.958395 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.958379 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3"} err="failed to get container status \"a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3\": rpc error: code = NotFound desc = could not find container \"a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3\": container with ID starting with a07f0d2dff11ce3ff2310913d17534da63b486e6a9660fe30134329814972dc3 not found: ID does not exist" Apr 22 18:50:59.958448 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.958395 2579 scope.go:117] "RemoveContainer" containerID="18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c" Apr 22 18:50:59.958632 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.958615 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c"} err="failed to get container status \"18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c\": rpc error: code = NotFound desc = could not find container \"18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c\": container with ID starting with 18653b8ab992699729d50b115d42ab3bc865b4da3dceb8dbd528bd5540dc552c not found: ID does not exist" Apr 22 18:50:59.958686 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.958633 2579 scope.go:117] "RemoveContainer" containerID="f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341" Apr 22 18:50:59.958838 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.958817 2579 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341"} err="failed to get container status \"f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341\": rpc error: code = NotFound desc = could not find container \"f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341\": container with ID starting with f0a09ddab567504cb11ccd71b27cd35259a2b39db9bf72235a9ad457a9da4341 not found: ID does not exist" Apr 22 18:50:59.958905 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.958840 2579 scope.go:117] "RemoveContainer" containerID="1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44" Apr 22 18:50:59.959078 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.959055 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44"} err="failed to get container status \"1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44\": rpc error: code = NotFound desc = could not find container \"1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44\": container with ID starting with 1d07c55ce0256f969f76ca2cb120d6cf146dc80072c356f615e632e8162ebf44 not found: ID does not exist" Apr 22 18:50:59.959125 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.959080 2579 scope.go:117] "RemoveContainer" containerID="6b3e273f7d4eb0e5daf4d9ddd952d3a0eced40ce29aae842c0318f3518127c07" Apr 22 18:50:59.959272 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.959255 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b3e273f7d4eb0e5daf4d9ddd952d3a0eced40ce29aae842c0318f3518127c07"} err="failed to get container status \"6b3e273f7d4eb0e5daf4d9ddd952d3a0eced40ce29aae842c0318f3518127c07\": rpc error: code = NotFound desc = could not find container \"6b3e273f7d4eb0e5daf4d9ddd952d3a0eced40ce29aae842c0318f3518127c07\": container with ID starting with 6b3e273f7d4eb0e5daf4d9ddd952d3a0eced40ce29aae842c0318f3518127c07 not found: ID does not exist" Apr 22 18:50:59.990708 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.990672 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/499f8028-1941-4aaf-a1fa-a18e381ba1ea-config-out\") pod \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " Apr 22 18:50:59.990848 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.990717 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-thanos-prometheus-http-client-file\") pod \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " Apr 22 18:50:59.990848 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.990748 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-secret-kube-rbac-proxy\") pod \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " Apr 22 18:50:59.990848 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.990773 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-web-config\") pod \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\" (UID: 
\"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " Apr 22 18:50:59.990848 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.990792 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/499f8028-1941-4aaf-a1fa-a18e381ba1ea-configmap-serving-certs-ca-bundle\") pod \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " Apr 22 18:50:59.991070 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.990889 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-config\") pod \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " Apr 22 18:50:59.991070 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.990947 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/499f8028-1941-4aaf-a1fa-a18e381ba1ea-prometheus-trusted-ca-bundle\") pod \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " Apr 22 18:50:59.991070 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.990995 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/499f8028-1941-4aaf-a1fa-a18e381ba1ea-configmap-metrics-client-ca\") pod \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " Apr 22 18:50:59.991070 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.991023 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-secret-grpc-tls\") pod \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " Apr 22 18:50:59.991070 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.991067 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-secret-metrics-client-certs\") pod \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " Apr 22 18:50:59.991325 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.991102 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/499f8028-1941-4aaf-a1fa-a18e381ba1ea-prometheus-k8s-db\") pod \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " Apr 22 18:50:59.991325 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.991149 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/499f8028-1941-4aaf-a1fa-a18e381ba1ea-prometheus-k8s-rulefiles-0\") pod \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " Apr 22 18:50:59.991325 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.991180 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/499f8028-1941-4aaf-a1fa-a18e381ba1ea-tls-assets\") pod \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " Apr 22 18:50:59.991325 ip-10-0-136-53 
kubenswrapper[2579]: I0422 18:50:59.991187 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/499f8028-1941-4aaf-a1fa-a18e381ba1ea-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "499f8028-1941-4aaf-a1fa-a18e381ba1ea" (UID: "499f8028-1941-4aaf-a1fa-a18e381ba1ea"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:50:59.991325 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.991205 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/499f8028-1941-4aaf-a1fa-a18e381ba1ea-configmap-kubelet-serving-ca-bundle\") pod \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " Apr 22 18:50:59.991325 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.991242 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctrnj\" (UniqueName: \"kubernetes.io/projected/499f8028-1941-4aaf-a1fa-a18e381ba1ea-kube-api-access-ctrnj\") pod \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " Apr 22 18:50:59.991325 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.991271 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " Apr 22 18:50:59.991325 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.991304 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-secret-prometheus-k8s-tls\") pod \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " Apr 22 18:50:59.991742 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.991330 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\" (UID: \"499f8028-1941-4aaf-a1fa-a18e381ba1ea\") " Apr 22 18:50:59.991742 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.991582 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/499f8028-1941-4aaf-a1fa-a18e381ba1ea-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "499f8028-1941-4aaf-a1fa-a18e381ba1ea" (UID: "499f8028-1941-4aaf-a1fa-a18e381ba1ea"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:50:59.991742 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.991610 2579 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/499f8028-1941-4aaf-a1fa-a18e381ba1ea-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 18:50:59.992223 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.992196 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/499f8028-1941-4aaf-a1fa-a18e381ba1ea-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "499f8028-1941-4aaf-a1fa-a18e381ba1ea" (UID: "499f8028-1941-4aaf-a1fa-a18e381ba1ea"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:50:59.992628 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.992602 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/499f8028-1941-4aaf-a1fa-a18e381ba1ea-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "499f8028-1941-4aaf-a1fa-a18e381ba1ea" (UID: "499f8028-1941-4aaf-a1fa-a18e381ba1ea"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:50:59.993117 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.993047 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/499f8028-1941-4aaf-a1fa-a18e381ba1ea-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "499f8028-1941-4aaf-a1fa-a18e381ba1ea" (UID: "499f8028-1941-4aaf-a1fa-a18e381ba1ea"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:50:59.993666 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.993642 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "499f8028-1941-4aaf-a1fa-a18e381ba1ea" (UID: "499f8028-1941-4aaf-a1fa-a18e381ba1ea"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:50:59.993882 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.993863 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/499f8028-1941-4aaf-a1fa-a18e381ba1ea-config-out" (OuterVolumeSpecName: "config-out") pod "499f8028-1941-4aaf-a1fa-a18e381ba1ea" (UID: "499f8028-1941-4aaf-a1fa-a18e381ba1ea"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:50:59.994340 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.994048 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-config" (OuterVolumeSpecName: "config") pod "499f8028-1941-4aaf-a1fa-a18e381ba1ea" (UID: "499f8028-1941-4aaf-a1fa-a18e381ba1ea"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:50:59.994472 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.994284 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "499f8028-1941-4aaf-a1fa-a18e381ba1ea" (UID: "499f8028-1941-4aaf-a1fa-a18e381ba1ea"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:50:59.995082 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.995050 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/499f8028-1941-4aaf-a1fa-a18e381ba1ea-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "499f8028-1941-4aaf-a1fa-a18e381ba1ea" (UID: "499f8028-1941-4aaf-a1fa-a18e381ba1ea"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:50:59.995272 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.995252 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "499f8028-1941-4aaf-a1fa-a18e381ba1ea" (UID: "499f8028-1941-4aaf-a1fa-a18e381ba1ea"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:50:59.995272 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.995257 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/499f8028-1941-4aaf-a1fa-a18e381ba1ea-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "499f8028-1941-4aaf-a1fa-a18e381ba1ea" (UID: "499f8028-1941-4aaf-a1fa-a18e381ba1ea"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:50:59.995742 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.995718 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "499f8028-1941-4aaf-a1fa-a18e381ba1ea" (UID: "499f8028-1941-4aaf-a1fa-a18e381ba1ea"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:50:59.995742 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.995727 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "499f8028-1941-4aaf-a1fa-a18e381ba1ea" (UID: "499f8028-1941-4aaf-a1fa-a18e381ba1ea"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:50:59.995873 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.995824 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "499f8028-1941-4aaf-a1fa-a18e381ba1ea" (UID: "499f8028-1941-4aaf-a1fa-a18e381ba1ea"). InnerVolumeSpecName "secret-prometheus-k8s-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:50:59.995915 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.995889 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "499f8028-1941-4aaf-a1fa-a18e381ba1ea" (UID: "499f8028-1941-4aaf-a1fa-a18e381ba1ea"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:50:59.996706 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:50:59.996686 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/499f8028-1941-4aaf-a1fa-a18e381ba1ea-kube-api-access-ctrnj" (OuterVolumeSpecName: "kube-api-access-ctrnj") pod "499f8028-1941-4aaf-a1fa-a18e381ba1ea" (UID: "499f8028-1941-4aaf-a1fa-a18e381ba1ea"). InnerVolumeSpecName "kube-api-access-ctrnj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:51:00.004389 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.004366 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-web-config" (OuterVolumeSpecName: "web-config") pod "499f8028-1941-4aaf-a1fa-a18e381ba1ea" (UID: "499f8028-1941-4aaf-a1fa-a18e381ba1ea"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:51:00.092565 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.092525 2579 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-secret-kube-rbac-proxy\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 18:51:00.092565 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.092556 2579 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-web-config\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 18:51:00.092565 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.092566 2579 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-config\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 18:51:00.092565 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.092575 2579 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/499f8028-1941-4aaf-a1fa-a18e381ba1ea-prometheus-trusted-ca-bundle\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 18:51:00.092826 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.092586 2579 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/499f8028-1941-4aaf-a1fa-a18e381ba1ea-configmap-metrics-client-ca\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 18:51:00.092826 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.092595 2579 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-secret-grpc-tls\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 18:51:00.092826 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.092604 2579 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-secret-metrics-client-certs\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 18:51:00.092826 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.092613 2579 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/499f8028-1941-4aaf-a1fa-a18e381ba1ea-prometheus-k8s-db\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 18:51:00.092826 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.092621 2579 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/499f8028-1941-4aaf-a1fa-a18e381ba1ea-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 18:51:00.092826 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.092629 2579 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/499f8028-1941-4aaf-a1fa-a18e381ba1ea-tls-assets\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 18:51:00.092826 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.092638 2579 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/499f8028-1941-4aaf-a1fa-a18e381ba1ea-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 18:51:00.092826 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.092646 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ctrnj\" (UniqueName: \"kubernetes.io/projected/499f8028-1941-4aaf-a1fa-a18e381ba1ea-kube-api-access-ctrnj\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 18:51:00.092826 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.092655 2579 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 18:51:00.092826 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.092664 2579 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-secret-prometheus-k8s-tls\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 18:51:00.092826 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.092672 2579 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 18:51:00.092826 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.092708 2579 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/499f8028-1941-4aaf-a1fa-a18e381ba1ea-config-out\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 18:51:00.092826 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.092717 2579 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/499f8028-1941-4aaf-a1fa-a18e381ba1ea-thanos-prometheus-http-client-file\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 18:51:00.224411 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.224379 
Apr 22 18:51:00.224411 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.224379 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 18:51:00.229276 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.229248 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 18:51:00.258113 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.258086 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 18:51:00.258336 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.258324 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="499f8028-1941-4aaf-a1fa-a18e381ba1ea" containerName="kube-rbac-proxy"
Apr 22 18:51:00.258389 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.258338 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="499f8028-1941-4aaf-a1fa-a18e381ba1ea" containerName="kube-rbac-proxy"
Apr 22 18:51:00.258389 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.258351 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="499f8028-1941-4aaf-a1fa-a18e381ba1ea" containerName="kube-rbac-proxy-web"
Apr 22 18:51:00.258389 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.258357 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="499f8028-1941-4aaf-a1fa-a18e381ba1ea" containerName="kube-rbac-proxy-web"
Apr 22 18:51:00.258389 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.258364 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="499f8028-1941-4aaf-a1fa-a18e381ba1ea" containerName="init-config-reloader"
Apr 22 18:51:00.258389 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.258369 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="499f8028-1941-4aaf-a1fa-a18e381ba1ea" containerName="init-config-reloader"
Apr 22 18:51:00.258389 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.258376 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="499f8028-1941-4aaf-a1fa-a18e381ba1ea" containerName="config-reloader"
Apr 22 18:51:00.258389 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.258381 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="499f8028-1941-4aaf-a1fa-a18e381ba1ea" containerName="config-reloader"
Apr 22 18:51:00.258631 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.258392 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="499f8028-1941-4aaf-a1fa-a18e381ba1ea" containerName="kube-rbac-proxy-thanos"
Apr 22 18:51:00.258631 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.258397 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="499f8028-1941-4aaf-a1fa-a18e381ba1ea" containerName="kube-rbac-proxy-thanos"
Apr 22 18:51:00.258631 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.258405 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="499f8028-1941-4aaf-a1fa-a18e381ba1ea" containerName="prometheus"
Apr 22 18:51:00.258631 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.258412 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="499f8028-1941-4aaf-a1fa-a18e381ba1ea" containerName="prometheus"
Apr 22 18:51:00.258631 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.258420 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ceb18df5-258c-4b70-9799-2da231a29e8f" containerName="registry"
Apr 22 18:51:00.258631 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.258425 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceb18df5-258c-4b70-9799-2da231a29e8f" containerName="registry"
Apr 22 18:51:00.258631 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.258432 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="499f8028-1941-4aaf-a1fa-a18e381ba1ea" containerName="thanos-sidecar"
Apr 22 18:51:00.258631 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.258437 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="499f8028-1941-4aaf-a1fa-a18e381ba1ea" containerName="thanos-sidecar"
Apr 22 18:51:00.258631 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.258478 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="ceb18df5-258c-4b70-9799-2da231a29e8f" containerName="registry"
Apr 22 18:51:00.258631 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.258504 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="499f8028-1941-4aaf-a1fa-a18e381ba1ea" containerName="kube-rbac-proxy"
Apr 22 18:51:00.258631 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.258512 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="499f8028-1941-4aaf-a1fa-a18e381ba1ea" containerName="config-reloader"
Apr 22 18:51:00.258631 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.258518 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="499f8028-1941-4aaf-a1fa-a18e381ba1ea" containerName="kube-rbac-proxy-web"
Apr 22 18:51:00.258631 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.258524 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="499f8028-1941-4aaf-a1fa-a18e381ba1ea" containerName="prometheus"
Apr 22 18:51:00.258631 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.258530 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="499f8028-1941-4aaf-a1fa-a18e381ba1ea" containerName="thanos-sidecar"
Apr 22 18:51:00.258631 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.258536 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="499f8028-1941-4aaf-a1fa-a18e381ba1ea" containerName="kube-rbac-proxy-thanos"
Apr 22 18:51:00.263841 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.263824 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:51:00.266597 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.266571 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 22 18:51:00.266786 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.266761 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 22 18:51:00.266902 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.266883 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-9jfotpgv8prin\""
Apr 22 18:51:00.266984 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.266898 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 22 18:51:00.266984 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.266913 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 22 18:51:00.266984 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.266946 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 22 18:51:00.267128 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.266915 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 22 18:51:00.267128 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.267008 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 22 18:51:00.267128 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.266945 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 22 18:51:00.267559 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.267544 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 22 18:51:00.267630 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.267597 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 22 18:51:00.267696 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.267679 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-6zkff\""
Apr 22 18:51:00.271121 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.271102 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 22 18:51:00.272718 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.272703 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 22 18:51:00.276385 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.276365 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
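The reflector.go lines show the kubelet warming an informer cache for every Secret and ConfigMap the recreated pod mounts, before any volume setup begins; the SyncLoop ADD/UPDATE lines bracket that work. To get a quick overview of how often each pod churns through the sync loop in an excerpt like this, a small hypothetical helper (stdlib only, reads the journal text from stdin, assumes single-pod `pods=[...]` lists as seen above) could be:

    package main

    import (
        "bufio"
        "fmt"
        "os"
        "regexp"
    )

    func main() {
        // Matches: "SyncLoop ADD" source="api" pods=["ns/name"]
        re := regexp.MustCompile(`"SyncLoop (ADD|UPDATE|DELETE|REMOVE)" source="api" pods=\["([^"]+)"\]`)
        counts := map[string]map[string]int{}
        sc := bufio.NewScanner(os.Stdin)
        sc.Buffer(make([]byte, 0, 1<<20), 1<<20)
        for sc.Scan() {
            for _, m := range re.FindAllStringSubmatch(sc.Text(), -1) {
                op, pod := m[1], m[2]
                if counts[pod] == nil {
                    counts[pod] = map[string]int{}
                }
                counts[pod][op]++
            }
        }
        for pod, ops := range counts {
            fmt.Println(pod, ops)
        }
    }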
\"kubernetes.io/empty-dir/c31833ba-2851-45cc-9133-315b38587337-config-out\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.395919 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.395926 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c31833ba-2851-45cc-9133-315b38587337-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.396198 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.395947 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c31833ba-2851-45cc-9133-315b38587337-web-config\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.396198 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.396028 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c31833ba-2851-45cc-9133-315b38587337-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.396198 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.396056 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c31833ba-2851-45cc-9133-315b38587337-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.396198 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.396076 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c31833ba-2851-45cc-9133-315b38587337-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.396198 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.396138 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c31833ba-2851-45cc-9133-315b38587337-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.396198 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.396177 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c31833ba-2851-45cc-9133-315b38587337-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.396198 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.396196 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz42b\" (UniqueName: \"kubernetes.io/projected/c31833ba-2851-45cc-9133-315b38587337-kube-api-access-wz42b\") pod 
\"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.396710 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.396218 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c31833ba-2851-45cc-9133-315b38587337-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.396710 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.396238 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c31833ba-2851-45cc-9133-315b38587337-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.396710 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.396257 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c31833ba-2851-45cc-9133-315b38587337-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.396710 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.396366 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c31833ba-2851-45cc-9133-315b38587337-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.396710 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.396407 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c31833ba-2851-45cc-9133-315b38587337-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.396710 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.396454 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c31833ba-2851-45cc-9133-315b38587337-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.396710 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.396507 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c31833ba-2851-45cc-9133-315b38587337-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.396710 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.396539 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c31833ba-2851-45cc-9133-315b38587337-config\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.396710 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.396563 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c31833ba-2851-45cc-9133-315b38587337-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.497999 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.497955 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c31833ba-2851-45cc-9133-315b38587337-web-config\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.498197 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.498015 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c31833ba-2851-45cc-9133-315b38587337-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.498197 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.498070 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c31833ba-2851-45cc-9133-315b38587337-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.498197 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.498114 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c31833ba-2851-45cc-9133-315b38587337-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.498197 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.498140 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c31833ba-2851-45cc-9133-315b38587337-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.498197 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.498173 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c31833ba-2851-45cc-9133-315b38587337-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.498197 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.498189 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wz42b\" (UniqueName: \"kubernetes.io/projected/c31833ba-2851-45cc-9133-315b38587337-kube-api-access-wz42b\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.498519 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.498216 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c31833ba-2851-45cc-9133-315b38587337-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.498519 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.498241 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c31833ba-2851-45cc-9133-315b38587337-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.498519 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.498266 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c31833ba-2851-45cc-9133-315b38587337-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.498519 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.498328 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c31833ba-2851-45cc-9133-315b38587337-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.498519 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.498351 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c31833ba-2851-45cc-9133-315b38587337-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.498519 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.498395 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c31833ba-2851-45cc-9133-315b38587337-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.498519 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.498430 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c31833ba-2851-45cc-9133-315b38587337-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.498519 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.498460 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c31833ba-2851-45cc-9133-315b38587337-config\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.498519 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.498510 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c31833ba-2851-45cc-9133-315b38587337-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.498950 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.498542 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c31833ba-2851-45cc-9133-315b38587337-config-out\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.498950 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.498566 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c31833ba-2851-45cc-9133-315b38587337-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.498950 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.498896 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c31833ba-2851-45cc-9133-315b38587337-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.499085 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.498982 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c31833ba-2851-45cc-9133-315b38587337-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.499540 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.499512 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c31833ba-2851-45cc-9133-315b38587337-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.501337 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.501205 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c31833ba-2851-45cc-9133-315b38587337-web-config\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.501337 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.501298 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c31833ba-2851-45cc-9133-315b38587337-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.501337 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.501313 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c31833ba-2851-45cc-9133-315b38587337-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.501815 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.501435 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/c31833ba-2851-45cc-9133-315b38587337-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.501815 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.501612 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c31833ba-2851-45cc-9133-315b38587337-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.501815 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.501615 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c31833ba-2851-45cc-9133-315b38587337-config\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.501815 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.501736 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c31833ba-2851-45cc-9133-315b38587337-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.502143 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.501970 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c31833ba-2851-45cc-9133-315b38587337-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.502202 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.502146 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c31833ba-2851-45cc-9133-315b38587337-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.503339 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.503320 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c31833ba-2851-45cc-9133-315b38587337-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.503497 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.503460 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c31833ba-2851-45cc-9133-315b38587337-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.503613 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.503598 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c31833ba-2851-45cc-9133-315b38587337-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:00.503960 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.503943 2579 
Apr 22 18:51:00.503960 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.503943 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c31833ba-2851-45cc-9133-315b38587337-config-out\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:51:00.504134 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.504113 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c31833ba-2851-45cc-9133-315b38587337-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:51:00.506220 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.506202 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz42b\" (UniqueName: \"kubernetes.io/projected/c31833ba-2851-45cc-9133-315b38587337-kube-api-access-wz42b\") pod \"prometheus-k8s-0\" (UID: \"c31833ba-2851-45cc-9133-315b38587337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:51:00.573346 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.573303 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:51:00.701220 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.701182 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 18:51:00.704022 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:51:00.703995 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc31833ba_2851_45cc_9133_315b38587337.slice/crio-7359613546e0f456d2625ade956a6574d145664dace5465d6263ab4e24eed46d WatchSource:0}: Error finding container 7359613546e0f456d2625ade956a6574d145664dace5465d6263ab4e24eed46d: Status 404 returned error can't find the container with id 7359613546e0f456d2625ade956a6574d145664dace5465d6263ab4e24eed46d
Apr 22 18:51:00.908976 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.908940 2579 generic.go:358] "Generic (PLEG): container finished" podID="c31833ba-2851-45cc-9133-315b38587337" containerID="6dfbb9d500bb5d6e69d4a625b529037ea0df6f186b9f5e10894a1bdee9ba07be" exitCode=0
Apr 22 18:51:00.909131 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.909032 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c31833ba-2851-45cc-9133-315b38587337","Type":"ContainerDied","Data":"6dfbb9d500bb5d6e69d4a625b529037ea0df6f186b9f5e10894a1bdee9ba07be"}
Apr 22 18:51:00.909131 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:00.909068 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c31833ba-2851-45cc-9133-315b38587337","Type":"ContainerStarted","Data":"7359613546e0f456d2625ade956a6574d145664dace5465d6263ab4e24eed46d"}
Apr 22 18:51:01.914284 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:01.914199 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c31833ba-2851-45cc-9133-315b38587337","Type":"ContainerStarted","Data":"f54d742755e4750b508b914cc2722d95a0b5e2d2b967f6601a8b60d7ed4f2465"}
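The manager.go warning is cAdvisor reacting to a cgroup watch event before CRI-O has registered the new container; it is transient and harmless here, since the PLEG (Pod Lifecycle Event Generator) lines that follow show the container running. In these lines the `event={...}` payload happens to be valid JSON, so it can be decoded directly; a minimal sketch (the struct shape is inferred from the log text, not taken from kubelet source):

    package main

    import (
        "encoding/json"
        "fmt"
    )

    func main() {
        // Payload copied from the ContainerDied line above; the container
        // that finished with exitCode=0 is presumably the pod's init container.
        raw := `{"ID":"c31833ba-2851-45cc-9133-315b38587337","Type":"ContainerDied","Data":"6dfbb9d500bb5d6e69d4a625b529037ea0df6f186b9f5e10894a1bdee9ba07be"}`

        var ev struct {
            ID   string // pod UID
            Type string // ContainerStarted, ContainerDied, ...
            Data string // container or sandbox ID
        }
        if err := json.Unmarshal([]byte(raw), &ev); err != nil {
            panic(err)
        }
        fmt.Printf("pod %s: %s %s...\n", ev.ID, ev.Type, ev.Data[:12])
    }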
event={"ID":"c31833ba-2851-45cc-9133-315b38587337","Type":"ContainerStarted","Data":"db21679c2433284efe1bed1b4fd1df824b380a92f6bcba914d8d64d5c0ef860c"} Apr 22 18:51:01.914284 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:01.914244 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c31833ba-2851-45cc-9133-315b38587337","Type":"ContainerStarted","Data":"7786dc6fd5bef571f139d31549b592632a9843cdfe7f548d029873525231384b"} Apr 22 18:51:01.914284 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:01.914253 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c31833ba-2851-45cc-9133-315b38587337","Type":"ContainerStarted","Data":"9463b5278f85b8e9ff6d8fe4033f11a849d1e197ba31c80f39c9a02f6a57e6cf"} Apr 22 18:51:01.914284 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:01.914263 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c31833ba-2851-45cc-9133-315b38587337","Type":"ContainerStarted","Data":"7e21a8234e6b9cc7074e737466435a220629d3301d97d4b3f8134829e547651a"} Apr 22 18:51:01.914284 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:01.914271 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c31833ba-2851-45cc-9133-315b38587337","Type":"ContainerStarted","Data":"7e8fd428a28249efe0111c739983f23b908035841e8c1c024c55b17c689eb130"} Apr 22 18:51:01.943372 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:01.943307 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.943287797 podStartE2EDuration="1.943287797s" podCreationTimestamp="2026-04-22 18:51:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:51:01.938247222 +0000 UTC m=+248.387029553" watchObservedRunningTime="2026-04-22 18:51:01.943287797 +0000 UTC m=+248.392070129" Apr 22 18:51:02.163770 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:02.163735 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="499f8028-1941-4aaf-a1fa-a18e381ba1ea" path="/var/lib/kubelet/pods/499f8028-1941-4aaf-a1fa-a18e381ba1ea/volumes" Apr 22 18:51:05.573858 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:05.573821 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:05.940838 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:05.940754 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89068fbb-3d09-49a2-a369-e83cb48aaf0f-metrics-certs\") pod \"network-metrics-daemon-z69xm\" (UID: \"89068fbb-3d09-49a2-a369-e83cb48aaf0f\") " pod="openshift-multus/network-metrics-daemon-z69xm" Apr 22 18:51:05.943029 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:05.943008 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89068fbb-3d09-49a2-a369-e83cb48aaf0f-metrics-certs\") pod \"network-metrics-daemon-z69xm\" (UID: \"89068fbb-3d09-49a2-a369-e83cb48aaf0f\") " pod="openshift-multus/network-metrics-daemon-z69xm" Apr 22 18:51:05.963046 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:05.963016 2579 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-m94d4\"" Apr 22 18:51:05.970609 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:05.970590 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z69xm" Apr 22 18:51:06.085240 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:06.085215 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-z69xm"] Apr 22 18:51:06.087742 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:51:06.087713 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89068fbb_3d09_49a2_a369_e83cb48aaf0f.slice/crio-e9b18040fbe788b969cc3cf13e0d3dc6b9734ab985261b1f975c00752344230a WatchSource:0}: Error finding container e9b18040fbe788b969cc3cf13e0d3dc6b9734ab985261b1f975c00752344230a: Status 404 returned error can't find the container with id e9b18040fbe788b969cc3cf13e0d3dc6b9734ab985261b1f975c00752344230a Apr 22 18:51:06.933612 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:06.933578 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z69xm" event={"ID":"89068fbb-3d09-49a2-a369-e83cb48aaf0f","Type":"ContainerStarted","Data":"e9b18040fbe788b969cc3cf13e0d3dc6b9734ab985261b1f975c00752344230a"} Apr 22 18:51:07.937934 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:07.937892 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z69xm" event={"ID":"89068fbb-3d09-49a2-a369-e83cb48aaf0f","Type":"ContainerStarted","Data":"288097b68d80e3a200480e16dacc0c98448a709d6ebc99551a31420fb4459934"} Apr 22 18:51:07.937934 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:07.937930 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z69xm" event={"ID":"89068fbb-3d09-49a2-a369-e83cb48aaf0f","Type":"ContainerStarted","Data":"cc26e558bf91179c827a5397f42a03d2c8664440850f63edaa86980cb15355a2"} Apr 22 18:51:07.952146 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:07.952087 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-z69xm" podStartSLOduration=253.081565554 podStartE2EDuration="4m13.952069653s" podCreationTimestamp="2026-04-22 18:46:54 +0000 UTC" firstStartedPulling="2026-04-22 18:51:06.089455155 +0000 UTC m=+252.538237464" lastFinishedPulling="2026-04-22 18:51:06.959959251 +0000 UTC m=+253.408741563" observedRunningTime="2026-04-22 18:51:07.950955452 +0000 UTC m=+254.399737783" watchObservedRunningTime="2026-04-22 18:51:07.952069653 +0000 UTC m=+254.400851983" Apr 22 18:51:20.421859 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:20.421770 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-c9qkp"] Apr 22 18:51:20.427540 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:20.427521 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-c9qkp" Apr 22 18:51:20.429646 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:20.429620 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 18:51:20.432577 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:20.432549 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-c9qkp"] Apr 22 18:51:20.460340 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:20.460309 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/fe6dcc87-ee88-4a3d-a648-de6bce8431df-dbus\") pod \"global-pull-secret-syncer-c9qkp\" (UID: \"fe6dcc87-ee88-4a3d-a648-de6bce8431df\") " pod="kube-system/global-pull-secret-syncer-c9qkp" Apr 22 18:51:20.460525 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:20.460350 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fe6dcc87-ee88-4a3d-a648-de6bce8431df-original-pull-secret\") pod \"global-pull-secret-syncer-c9qkp\" (UID: \"fe6dcc87-ee88-4a3d-a648-de6bce8431df\") " pod="kube-system/global-pull-secret-syncer-c9qkp" Apr 22 18:51:20.460525 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:20.460449 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/fe6dcc87-ee88-4a3d-a648-de6bce8431df-kubelet-config\") pod \"global-pull-secret-syncer-c9qkp\" (UID: \"fe6dcc87-ee88-4a3d-a648-de6bce8431df\") " pod="kube-system/global-pull-secret-syncer-c9qkp" Apr 22 18:51:20.561841 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:20.561801 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/fe6dcc87-ee88-4a3d-a648-de6bce8431df-kubelet-config\") pod \"global-pull-secret-syncer-c9qkp\" (UID: \"fe6dcc87-ee88-4a3d-a648-de6bce8431df\") " pod="kube-system/global-pull-secret-syncer-c9qkp" Apr 22 18:51:20.562033 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:20.561862 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/fe6dcc87-ee88-4a3d-a648-de6bce8431df-dbus\") pod \"global-pull-secret-syncer-c9qkp\" (UID: \"fe6dcc87-ee88-4a3d-a648-de6bce8431df\") " pod="kube-system/global-pull-secret-syncer-c9qkp" Apr 22 18:51:20.562033 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:20.561903 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fe6dcc87-ee88-4a3d-a648-de6bce8431df-original-pull-secret\") pod \"global-pull-secret-syncer-c9qkp\" (UID: \"fe6dcc87-ee88-4a3d-a648-de6bce8431df\") " pod="kube-system/global-pull-secret-syncer-c9qkp" Apr 22 18:51:20.562033 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:20.561943 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/fe6dcc87-ee88-4a3d-a648-de6bce8431df-kubelet-config\") pod \"global-pull-secret-syncer-c9qkp\" (UID: \"fe6dcc87-ee88-4a3d-a648-de6bce8431df\") " pod="kube-system/global-pull-secret-syncer-c9qkp" Apr 22 18:51:20.562199 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:20.562054 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"dbus\" (UniqueName: \"kubernetes.io/host-path/fe6dcc87-ee88-4a3d-a648-de6bce8431df-dbus\") pod \"global-pull-secret-syncer-c9qkp\" (UID: \"fe6dcc87-ee88-4a3d-a648-de6bce8431df\") " pod="kube-system/global-pull-secret-syncer-c9qkp" Apr 22 18:51:20.564224 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:20.564203 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fe6dcc87-ee88-4a3d-a648-de6bce8431df-original-pull-secret\") pod \"global-pull-secret-syncer-c9qkp\" (UID: \"fe6dcc87-ee88-4a3d-a648-de6bce8431df\") " pod="kube-system/global-pull-secret-syncer-c9qkp" Apr 22 18:51:20.736810 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:20.736780 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-c9qkp" Apr 22 18:51:20.851237 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:20.851202 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-c9qkp"] Apr 22 18:51:20.854343 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:51:20.854302 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe6dcc87_ee88_4a3d_a648_de6bce8431df.slice/crio-8a59e70e3f133074122c58ee0ad5d3b3cd1907550000173088756b166e669395 WatchSource:0}: Error finding container 8a59e70e3f133074122c58ee0ad5d3b3cd1907550000173088756b166e669395: Status 404 returned error can't find the container with id 8a59e70e3f133074122c58ee0ad5d3b3cd1907550000173088756b166e669395 Apr 22 18:51:20.975854 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:20.975819 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-c9qkp" event={"ID":"fe6dcc87-ee88-4a3d-a648-de6bce8431df","Type":"ContainerStarted","Data":"8a59e70e3f133074122c58ee0ad5d3b3cd1907550000173088756b166e669395"} Apr 22 18:51:24.989329 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:24.989298 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-c9qkp" event={"ID":"fe6dcc87-ee88-4a3d-a648-de6bce8431df","Type":"ContainerStarted","Data":"0194c666d32535efd11faee8dd1cf669f90db5a4aa0c43322f52e0463cc2c971"} Apr 22 18:51:25.002727 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:25.002684 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-c9qkp" podStartSLOduration=1.264175359 podStartE2EDuration="5.002669565s" podCreationTimestamp="2026-04-22 18:51:20 +0000 UTC" firstStartedPulling="2026-04-22 18:51:20.856096123 +0000 UTC m=+267.304878438" lastFinishedPulling="2026-04-22 18:51:24.594590336 +0000 UTC m=+271.043372644" observedRunningTime="2026-04-22 18:51:25.001120904 +0000 UTC m=+271.449903234" watchObservedRunningTime="2026-04-22 18:51:25.002669565 +0000 UTC m=+271.451451895" Apr 22 18:51:54.053332 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:54.053302 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tcrqv_e741af54-1487-45cb-b5ea-22577783221e/ovn-acl-logging/0.log" Apr 22 18:51:54.053767 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:54.053577 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tcrqv_e741af54-1487-45cb-b5ea-22577783221e/ovn-acl-logging/0.log" Apr 22 18:51:54.057180 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:51:54.057163 2579 kubelet.go:1628] "Image garbage collection 
succeeded" Apr 22 18:52:00.573549 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:52:00.573501 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:52:00.588759 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:52:00.588731 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:52:01.105477 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:52:01.105450 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:54:13.118913 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:13.118833 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-nb8zj"] Apr 22 18:54:13.121978 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:13.121963 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-nb8zj" Apr 22 18:54:13.124955 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:13.124933 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 22 18:54:13.125093 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:13.124966 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 22 18:54:13.125093 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:13.125030 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-rkbqs\"" Apr 22 18:54:13.134856 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:13.134833 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-nb8zj"] Apr 22 18:54:13.192098 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:13.192056 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/c53fda76-d657-43ff-bebb-a616fa1dc603-operator-config\") pod \"servicemesh-operator3-55f49c5f94-nb8zj\" (UID: \"c53fda76-d657-43ff-bebb-a616fa1dc603\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-nb8zj" Apr 22 18:54:13.192294 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:13.192190 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9ch8\" (UniqueName: \"kubernetes.io/projected/c53fda76-d657-43ff-bebb-a616fa1dc603-kube-api-access-z9ch8\") pod \"servicemesh-operator3-55f49c5f94-nb8zj\" (UID: \"c53fda76-d657-43ff-bebb-a616fa1dc603\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-nb8zj" Apr 22 18:54:13.293458 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:13.293417 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z9ch8\" (UniqueName: \"kubernetes.io/projected/c53fda76-d657-43ff-bebb-a616fa1dc603-kube-api-access-z9ch8\") pod \"servicemesh-operator3-55f49c5f94-nb8zj\" (UID: \"c53fda76-d657-43ff-bebb-a616fa1dc603\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-nb8zj" Apr 22 18:54:13.293632 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:13.293481 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/c53fda76-d657-43ff-bebb-a616fa1dc603-operator-config\") pod 
\"servicemesh-operator3-55f49c5f94-nb8zj\" (UID: \"c53fda76-d657-43ff-bebb-a616fa1dc603\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-nb8zj" Apr 22 18:54:13.296035 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:13.296014 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/c53fda76-d657-43ff-bebb-a616fa1dc603-operator-config\") pod \"servicemesh-operator3-55f49c5f94-nb8zj\" (UID: \"c53fda76-d657-43ff-bebb-a616fa1dc603\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-nb8zj" Apr 22 18:54:13.301118 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:13.301091 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9ch8\" (UniqueName: \"kubernetes.io/projected/c53fda76-d657-43ff-bebb-a616fa1dc603-kube-api-access-z9ch8\") pod \"servicemesh-operator3-55f49c5f94-nb8zj\" (UID: \"c53fda76-d657-43ff-bebb-a616fa1dc603\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-nb8zj" Apr 22 18:54:13.430746 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:13.430657 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-nb8zj" Apr 22 18:54:13.563347 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:13.563311 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-nb8zj"] Apr 22 18:54:13.567233 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:54:13.567207 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc53fda76_d657_43ff_bebb_a616fa1dc603.slice/crio-4041f63d736d954733ca618718d7ae4968c35d165627a22349c568bb0ae882dc WatchSource:0}: Error finding container 4041f63d736d954733ca618718d7ae4968c35d165627a22349c568bb0ae882dc: Status 404 returned error can't find the container with id 4041f63d736d954733ca618718d7ae4968c35d165627a22349c568bb0ae882dc Apr 22 18:54:13.569647 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:13.569628 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:54:14.471070 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:14.471035 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-nb8zj" event={"ID":"c53fda76-d657-43ff-bebb-a616fa1dc603","Type":"ContainerStarted","Data":"4041f63d736d954733ca618718d7ae4968c35d165627a22349c568bb0ae882dc"} Apr 22 18:54:16.920665 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:16.920628 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-6869d4659b-496fv"] Apr 22 18:54:16.923869 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:16.923847 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6869d4659b-496fv" Apr 22 18:54:16.927277 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:16.927241 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 22 18:54:16.927277 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:16.927265 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-5ws6l\"" Apr 22 18:54:16.927277 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:16.927269 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 22 18:54:16.927530 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:16.927298 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 22 18:54:16.927609 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:16.927594 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:54:16.927657 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:16.927599 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 22 18:54:16.947204 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:16.947172 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6869d4659b-496fv"] Apr 22 18:54:17.028097 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:17.028055 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/54d5d20a-93bc-49e9-a149-9bb6ff0d511d-metrics-cert\") pod \"lws-controller-manager-6869d4659b-496fv\" (UID: \"54d5d20a-93bc-49e9-a149-9bb6ff0d511d\") " pod="openshift-lws-operator/lws-controller-manager-6869d4659b-496fv" Apr 22 18:54:17.028381 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:17.028359 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/54d5d20a-93bc-49e9-a149-9bb6ff0d511d-manager-config\") pod \"lws-controller-manager-6869d4659b-496fv\" (UID: \"54d5d20a-93bc-49e9-a149-9bb6ff0d511d\") " pod="openshift-lws-operator/lws-controller-manager-6869d4659b-496fv" Apr 22 18:54:17.028579 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:17.028562 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj75d\" (UniqueName: \"kubernetes.io/projected/54d5d20a-93bc-49e9-a149-9bb6ff0d511d-kube-api-access-lj75d\") pod \"lws-controller-manager-6869d4659b-496fv\" (UID: \"54d5d20a-93bc-49e9-a149-9bb6ff0d511d\") " pod="openshift-lws-operator/lws-controller-manager-6869d4659b-496fv" Apr 22 18:54:17.028687 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:17.028674 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54d5d20a-93bc-49e9-a149-9bb6ff0d511d-cert\") pod \"lws-controller-manager-6869d4659b-496fv\" (UID: \"54d5d20a-93bc-49e9-a149-9bb6ff0d511d\") " pod="openshift-lws-operator/lws-controller-manager-6869d4659b-496fv" Apr 22 18:54:17.129437 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:17.129324 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lj75d\" (UniqueName: \"kubernetes.io/projected/54d5d20a-93bc-49e9-a149-9bb6ff0d511d-kube-api-access-lj75d\") pod \"lws-controller-manager-6869d4659b-496fv\" (UID: \"54d5d20a-93bc-49e9-a149-9bb6ff0d511d\") " pod="openshift-lws-operator/lws-controller-manager-6869d4659b-496fv" Apr 22 18:54:17.129437 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:17.129381 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54d5d20a-93bc-49e9-a149-9bb6ff0d511d-cert\") pod \"lws-controller-manager-6869d4659b-496fv\" (UID: \"54d5d20a-93bc-49e9-a149-9bb6ff0d511d\") " pod="openshift-lws-operator/lws-controller-manager-6869d4659b-496fv" Apr 22 18:54:17.129437 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:17.129413 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/54d5d20a-93bc-49e9-a149-9bb6ff0d511d-metrics-cert\") pod \"lws-controller-manager-6869d4659b-496fv\" (UID: \"54d5d20a-93bc-49e9-a149-9bb6ff0d511d\") " pod="openshift-lws-operator/lws-controller-manager-6869d4659b-496fv" Apr 22 18:54:17.129437 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:17.129440 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/54d5d20a-93bc-49e9-a149-9bb6ff0d511d-manager-config\") pod \"lws-controller-manager-6869d4659b-496fv\" (UID: \"54d5d20a-93bc-49e9-a149-9bb6ff0d511d\") " pod="openshift-lws-operator/lws-controller-manager-6869d4659b-496fv" Apr 22 18:54:17.130246 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:17.130222 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/54d5d20a-93bc-49e9-a149-9bb6ff0d511d-manager-config\") pod \"lws-controller-manager-6869d4659b-496fv\" (UID: \"54d5d20a-93bc-49e9-a149-9bb6ff0d511d\") " pod="openshift-lws-operator/lws-controller-manager-6869d4659b-496fv" Apr 22 18:54:17.132036 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:17.132016 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54d5d20a-93bc-49e9-a149-9bb6ff0d511d-cert\") pod \"lws-controller-manager-6869d4659b-496fv\" (UID: \"54d5d20a-93bc-49e9-a149-9bb6ff0d511d\") " pod="openshift-lws-operator/lws-controller-manager-6869d4659b-496fv" Apr 22 18:54:17.132203 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:17.132182 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/54d5d20a-93bc-49e9-a149-9bb6ff0d511d-metrics-cert\") pod \"lws-controller-manager-6869d4659b-496fv\" (UID: \"54d5d20a-93bc-49e9-a149-9bb6ff0d511d\") " pod="openshift-lws-operator/lws-controller-manager-6869d4659b-496fv" Apr 22 18:54:17.139664 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:17.139636 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj75d\" (UniqueName: \"kubernetes.io/projected/54d5d20a-93bc-49e9-a149-9bb6ff0d511d-kube-api-access-lj75d\") pod \"lws-controller-manager-6869d4659b-496fv\" (UID: \"54d5d20a-93bc-49e9-a149-9bb6ff0d511d\") " pod="openshift-lws-operator/lws-controller-manager-6869d4659b-496fv" Apr 22 18:54:17.233851 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:17.233819 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6869d4659b-496fv" Apr 22 18:54:17.357967 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:17.357935 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6869d4659b-496fv"] Apr 22 18:54:17.361032 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:54:17.361005 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54d5d20a_93bc_49e9_a149_9bb6ff0d511d.slice/crio-54db006885fe7fc848e77c214cb9a06e62da86491ede55fd8317b3602e0ce652 WatchSource:0}: Error finding container 54db006885fe7fc848e77c214cb9a06e62da86491ede55fd8317b3602e0ce652: Status 404 returned error can't find the container with id 54db006885fe7fc848e77c214cb9a06e62da86491ede55fd8317b3602e0ce652 Apr 22 18:54:17.483230 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:17.483190 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6869d4659b-496fv" event={"ID":"54d5d20a-93bc-49e9-a149-9bb6ff0d511d","Type":"ContainerStarted","Data":"54db006885fe7fc848e77c214cb9a06e62da86491ede55fd8317b3602e0ce652"} Apr 22 18:54:17.484570 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:17.484550 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-nb8zj" event={"ID":"c53fda76-d657-43ff-bebb-a616fa1dc603","Type":"ContainerStarted","Data":"309b5a6cf9793e66a435ac1bdd87c25b3588b9a7a2fc989e0442f2829bc3df82"} Apr 22 18:54:17.484678 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:17.484660 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-nb8zj" Apr 22 18:54:17.503964 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:17.503915 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-nb8zj" podStartSLOduration=1.280486375 podStartE2EDuration="4.503896157s" podCreationTimestamp="2026-04-22 18:54:13 +0000 UTC" firstStartedPulling="2026-04-22 18:54:13.569786032 +0000 UTC m=+440.018568340" lastFinishedPulling="2026-04-22 18:54:16.79319581 +0000 UTC m=+443.241978122" observedRunningTime="2026-04-22 18:54:17.502799674 +0000 UTC m=+443.951582004" watchObservedRunningTime="2026-04-22 18:54:17.503896157 +0000 UTC m=+443.952678487" Apr 22 18:54:20.496226 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:20.496184 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6869d4659b-496fv" event={"ID":"54d5d20a-93bc-49e9-a149-9bb6ff0d511d","Type":"ContainerStarted","Data":"83595d80bb3916c58c44f183fc29f81ba706c1e26b712b84810fd08d24ecc5db"} Apr 22 18:54:20.496664 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:20.496320 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-6869d4659b-496fv" Apr 22 18:54:20.511809 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:20.511758 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-6869d4659b-496fv" podStartSLOduration=1.821507742 podStartE2EDuration="4.511741541s" podCreationTimestamp="2026-04-22 18:54:16 +0000 UTC" firstStartedPulling="2026-04-22 18:54:17.363173291 +0000 UTC m=+443.811955600" lastFinishedPulling="2026-04-22 18:54:20.053407088 +0000 UTC m=+446.502189399" 
observedRunningTime="2026-04-22 18:54:20.509566528 +0000 UTC m=+446.958348869" watchObservedRunningTime="2026-04-22 18:54:20.511741541 +0000 UTC m=+446.960523870" Apr 22 18:54:28.169666 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:28.169631 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r"] Apr 22 18:54:28.172831 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:28.172812 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r" Apr 22 18:54:28.174932 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:28.174904 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"openshift-ai-inference-openshift-default-dockercfg-d7rcp\"" Apr 22 18:54:28.175050 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:28.174909 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 22 18:54:28.185094 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:28.185060 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r"] Apr 22 18:54:28.333681 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:28.333637 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/32410f0c-444a-4b71-837b-09e27a6b6bb9-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-7v85r\" (UID: \"32410f0c-444a-4b71-837b-09e27a6b6bb9\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r" Apr 22 18:54:28.333681 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:28.333687 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/32410f0c-444a-4b71-837b-09e27a6b6bb9-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-7v85r\" (UID: \"32410f0c-444a-4b71-837b-09e27a6b6bb9\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r" Apr 22 18:54:28.333939 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:28.333713 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/32410f0c-444a-4b71-837b-09e27a6b6bb9-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-7v85r\" (UID: \"32410f0c-444a-4b71-837b-09e27a6b6bb9\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r" Apr 22 18:54:28.333939 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:28.333812 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/32410f0c-444a-4b71-837b-09e27a6b6bb9-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-7v85r\" (UID: \"32410f0c-444a-4b71-837b-09e27a6b6bb9\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r" Apr 22 18:54:28.333939 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:28.333854 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/32410f0c-444a-4b71-837b-09e27a6b6bb9-credential-socket\") 
pod \"openshift-ai-inference-openshift-default-7c5447bb76-7v85r\" (UID: \"32410f0c-444a-4b71-837b-09e27a6b6bb9\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r" Apr 22 18:54:28.333939 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:28.333885 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/32410f0c-444a-4b71-837b-09e27a6b6bb9-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-7v85r\" (UID: \"32410f0c-444a-4b71-837b-09e27a6b6bb9\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r" Apr 22 18:54:28.334121 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:28.333951 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/32410f0c-444a-4b71-837b-09e27a6b6bb9-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-7v85r\" (UID: \"32410f0c-444a-4b71-837b-09e27a6b6bb9\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r" Apr 22 18:54:28.334121 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:28.333980 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjkp2\" (UniqueName: \"kubernetes.io/projected/32410f0c-444a-4b71-837b-09e27a6b6bb9-kube-api-access-sjkp2\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-7v85r\" (UID: \"32410f0c-444a-4b71-837b-09e27a6b6bb9\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r" Apr 22 18:54:28.334121 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:28.334010 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/32410f0c-444a-4b71-837b-09e27a6b6bb9-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-7v85r\" (UID: \"32410f0c-444a-4b71-837b-09e27a6b6bb9\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r" Apr 22 18:54:28.435197 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:28.435090 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/32410f0c-444a-4b71-837b-09e27a6b6bb9-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-7v85r\" (UID: \"32410f0c-444a-4b71-837b-09e27a6b6bb9\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r" Apr 22 18:54:28.435197 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:28.435144 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/32410f0c-444a-4b71-837b-09e27a6b6bb9-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-7v85r\" (UID: \"32410f0c-444a-4b71-837b-09e27a6b6bb9\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r" Apr 22 18:54:28.435197 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:28.435173 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/32410f0c-444a-4b71-837b-09e27a6b6bb9-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-7v85r\" (UID: \"32410f0c-444a-4b71-837b-09e27a6b6bb9\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r" Apr 22 18:54:28.435477 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:28.435209 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/32410f0c-444a-4b71-837b-09e27a6b6bb9-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-7v85r\" (UID: \"32410f0c-444a-4b71-837b-09e27a6b6bb9\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r" Apr 22 18:54:28.435477 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:28.435235 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/32410f0c-444a-4b71-837b-09e27a6b6bb9-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-7v85r\" (UID: \"32410f0c-444a-4b71-837b-09e27a6b6bb9\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r" Apr 22 18:54:28.435477 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:28.435263 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/32410f0c-444a-4b71-837b-09e27a6b6bb9-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-7v85r\" (UID: \"32410f0c-444a-4b71-837b-09e27a6b6bb9\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r" Apr 22 18:54:28.435477 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:28.435325 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/32410f0c-444a-4b71-837b-09e27a6b6bb9-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-7v85r\" (UID: \"32410f0c-444a-4b71-837b-09e27a6b6bb9\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r" Apr 22 18:54:28.435477 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:28.435347 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjkp2\" (UniqueName: \"kubernetes.io/projected/32410f0c-444a-4b71-837b-09e27a6b6bb9-kube-api-access-sjkp2\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-7v85r\" (UID: \"32410f0c-444a-4b71-837b-09e27a6b6bb9\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r" Apr 22 18:54:28.435477 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:28.435374 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/32410f0c-444a-4b71-837b-09e27a6b6bb9-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-7v85r\" (UID: \"32410f0c-444a-4b71-837b-09e27a6b6bb9\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r" Apr 22 18:54:28.435798 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:28.435602 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/32410f0c-444a-4b71-837b-09e27a6b6bb9-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-7v85r\" (UID: \"32410f0c-444a-4b71-837b-09e27a6b6bb9\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r" Apr 22 18:54:28.435798 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:28.435727 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/32410f0c-444a-4b71-837b-09e27a6b6bb9-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-7v85r\" (UID: \"32410f0c-444a-4b71-837b-09e27a6b6bb9\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r" Apr 22 18:54:28.435798 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:28.435781 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/32410f0c-444a-4b71-837b-09e27a6b6bb9-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-7v85r\" (UID: \"32410f0c-444a-4b71-837b-09e27a6b6bb9\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r" Apr 22 18:54:28.435931 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:28.435907 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/32410f0c-444a-4b71-837b-09e27a6b6bb9-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-7v85r\" (UID: \"32410f0c-444a-4b71-837b-09e27a6b6bb9\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r" Apr 22 18:54:28.436237 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:28.436219 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/32410f0c-444a-4b71-837b-09e27a6b6bb9-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-7v85r\" (UID: \"32410f0c-444a-4b71-837b-09e27a6b6bb9\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r" Apr 22 18:54:28.437668 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:28.437648 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/32410f0c-444a-4b71-837b-09e27a6b6bb9-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-7v85r\" (UID: \"32410f0c-444a-4b71-837b-09e27a6b6bb9\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r" Apr 22 18:54:28.438018 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:28.438000 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/32410f0c-444a-4b71-837b-09e27a6b6bb9-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-7v85r\" (UID: \"32410f0c-444a-4b71-837b-09e27a6b6bb9\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r" Apr 22 18:54:28.442350 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:28.442323 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/32410f0c-444a-4b71-837b-09e27a6b6bb9-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-7v85r\" (UID: \"32410f0c-444a-4b71-837b-09e27a6b6bb9\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r" Apr 22 18:54:28.442575 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:28.442552 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjkp2\" (UniqueName: \"kubernetes.io/projected/32410f0c-444a-4b71-837b-09e27a6b6bb9-kube-api-access-sjkp2\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-7v85r\" (UID: \"32410f0c-444a-4b71-837b-09e27a6b6bb9\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r" Apr 22 18:54:28.486600 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:28.486559 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r" Apr 22 18:54:28.491298 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:28.491271 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-nb8zj" Apr 22 18:54:28.620919 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:28.620892 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r"] Apr 22 18:54:28.625948 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:54:28.625914 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32410f0c_444a_4b71_837b_09e27a6b6bb9.slice/crio-99e4ec0106f658aef16c5eeceafca2ceeff51d2d0e4aa461f9b8a45e7f260426 WatchSource:0}: Error finding container 99e4ec0106f658aef16c5eeceafca2ceeff51d2d0e4aa461f9b8a45e7f260426: Status 404 returned error can't find the container with id 99e4ec0106f658aef16c5eeceafca2ceeff51d2d0e4aa461f9b8a45e7f260426 Apr 22 18:54:29.525600 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:29.525563 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r" event={"ID":"32410f0c-444a-4b71-837b-09e27a6b6bb9","Type":"ContainerStarted","Data":"99e4ec0106f658aef16c5eeceafca2ceeff51d2d0e4aa461f9b8a45e7f260426"} Apr 22 18:54:31.276518 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:31.276448 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 22 18:54:31.276920 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:31.276567 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 22 18:54:31.276920 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:31.276614 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 22 18:54:31.501929 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:31.501822 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-6869d4659b-496fv" Apr 22 18:54:31.538430 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:31.538340 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r" event={"ID":"32410f0c-444a-4b71-837b-09e27a6b6bb9","Type":"ContainerStarted","Data":"229b1ff23c62a88ef3f68fdffbc3e40f8a4e4f67c285688b623586f4e9d90210"} Apr 22 18:54:31.556443 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:31.556393 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r" podStartSLOduration=0.908135115 podStartE2EDuration="3.556374933s" podCreationTimestamp="2026-04-22 18:54:28 +0000 UTC" firstStartedPulling="2026-04-22 18:54:28.627894746 +0000 UTC m=+455.076677054" lastFinishedPulling="2026-04-22 
Apr 22 18:54:32.487149 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:32.487112 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r"
Apr 22 18:54:32.491879 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:32.491851 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r"
Apr 22 18:54:32.542070 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:32.542033 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r"
Apr 22 18:54:32.543184 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:32.543163 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-7v85r"
Apr 22 18:54:53.198362 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:53.198326 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-cwlxp"]
Apr 22 18:54:53.211003 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:53.210977 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-cwlxp"
Apr 22 18:54:53.217041 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:53.216475 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 22 18:54:53.217041 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:53.216588 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\""
Apr 22 18:54:53.217641 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:53.217616 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 22 18:54:53.217769 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:53.217624 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-862cq\""
Apr 22 18:54:53.219687 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:53.219667 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-cwlxp"]
Apr 22 18:54:53.256840 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:53.256803 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmtct\" (UniqueName: \"kubernetes.io/projected/7962152c-6d85-4334-acaf-14c9a21560ce-kube-api-access-kmtct\") pod \"dns-operator-controller-manager-844548ff4c-cwlxp\" (UID: \"7962152c-6d85-4334-acaf-14c9a21560ce\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-cwlxp"
Apr 22 18:54:53.357286 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:53.357251 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kmtct\" (UniqueName: \"kubernetes.io/projected/7962152c-6d85-4334-acaf-14c9a21560ce-kube-api-access-kmtct\") pod \"dns-operator-controller-manager-844548ff4c-cwlxp\" (UID: \"7962152c-6d85-4334-acaf-14c9a21560ce\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-cwlxp"
Apr 22 18:54:53.365887 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:53.365857 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmtct\" (UniqueName: \"kubernetes.io/projected/7962152c-6d85-4334-acaf-14c9a21560ce-kube-api-access-kmtct\") pod \"dns-operator-controller-manager-844548ff4c-cwlxp\" (UID: \"7962152c-6d85-4334-acaf-14c9a21560ce\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-cwlxp"
Apr 22 18:54:53.527191 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:53.527157 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-cwlxp"
Apr 22 18:54:53.653545 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:53.653517 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-cwlxp"]
Apr 22 18:54:53.656219 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:54:53.656188 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7962152c_6d85_4334_acaf_14c9a21560ce.slice/crio-eb32027993293417c91570d6e3363b111992d34e49a12592125977d9368217a0 WatchSource:0}: Error finding container eb32027993293417c91570d6e3363b111992d34e49a12592125977d9368217a0: Status 404 returned error can't find the container with id eb32027993293417c91570d6e3363b111992d34e49a12592125977d9368217a0
Apr 22 18:54:54.612943 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:54.612906 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-cwlxp" event={"ID":"7962152c-6d85-4334-acaf-14c9a21560ce","Type":"ContainerStarted","Data":"eb32027993293417c91570d6e3363b111992d34e49a12592125977d9368217a0"}
Apr 22 18:54:56.774030 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:56.774002 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\""
Apr 22 18:54:57.625525 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:57.625469 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-cwlxp" event={"ID":"7962152c-6d85-4334-acaf-14c9a21560ce","Type":"ContainerStarted","Data":"f988c616beb15022e99aeb0805982160635b0ea753940ce10d463b17d9237e7b"}
Apr 22 18:54:57.625734 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:57.625713 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-cwlxp"
Apr 22 18:54:57.641860 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:54:57.641810 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-cwlxp" podStartSLOduration=1.528100764 podStartE2EDuration="4.641791602s" podCreationTimestamp="2026-04-22 18:54:53 +0000 UTC" firstStartedPulling="2026-04-22 18:54:53.658181805 +0000 UTC m=+480.106964113" lastFinishedPulling="2026-04-22 18:54:56.771872642 +0000 UTC m=+483.220654951" observedRunningTime="2026-04-22 18:54:57.641010078 +0000 UTC m=+484.089792409" watchObservedRunningTime="2026-04-22 18:54:57.641791602 +0000 UTC m=+484.090573934"
Apr 22 18:55:08.631587 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:08.631556 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-cwlxp"
pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-cwlxp" Apr 22 18:55:52.002441 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:52.002404 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-fndf6"] Apr 22 18:55:52.004909 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:52.004892 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-fndf6" Apr 22 18:55:52.007250 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:52.007230 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-2fl6w\"" Apr 22 18:55:52.007341 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:52.007313 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 22 18:55:52.010534 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:52.010507 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-fndf6"] Apr 22 18:55:52.077733 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:52.077698 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06-config-file\") pod \"limitador-limitador-64c8f475fb-fndf6\" (UID: \"29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-fndf6" Apr 22 18:55:52.077925 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:52.077803 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b644f\" (UniqueName: \"kubernetes.io/projected/29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06-kube-api-access-b644f\") pod \"limitador-limitador-64c8f475fb-fndf6\" (UID: \"29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-fndf6" Apr 22 18:55:52.105347 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:52.105312 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-fndf6"] Apr 22 18:55:52.178270 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:52.178228 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06-config-file\") pod \"limitador-limitador-64c8f475fb-fndf6\" (UID: \"29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-fndf6" Apr 22 18:55:52.178452 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:52.178299 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b644f\" (UniqueName: \"kubernetes.io/projected/29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06-kube-api-access-b644f\") pod \"limitador-limitador-64c8f475fb-fndf6\" (UID: \"29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-fndf6" Apr 22 18:55:52.178916 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:52.178897 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06-config-file\") pod \"limitador-limitador-64c8f475fb-fndf6\" (UID: \"29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-fndf6" Apr 22 18:55:52.185769 ip-10-0-136-53 kubenswrapper[2579]: I0422 
18:55:52.185734 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b644f\" (UniqueName: \"kubernetes.io/projected/29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06-kube-api-access-b644f\") pod \"limitador-limitador-64c8f475fb-fndf6\" (UID: \"29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-fndf6" Apr 22 18:55:52.316104 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:52.316003 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-fndf6" Apr 22 18:55:52.445849 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:52.445822 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-fndf6"] Apr 22 18:55:52.448564 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:55:52.448527 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29b35b2d_d1a7_49c2_b6e1_b3eb2fec1c06.slice/crio-013389a5f02fd6d91f383a96780afeb3ae2821d2c8c31fa3d9fcbfc5cac0fa2c WatchSource:0}: Error finding container 013389a5f02fd6d91f383a96780afeb3ae2821d2c8c31fa3d9fcbfc5cac0fa2c: Status 404 returned error can't find the container with id 013389a5f02fd6d91f383a96780afeb3ae2821d2c8c31fa3d9fcbfc5cac0fa2c Apr 22 18:55:52.786958 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:52.786921 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-674b59b84c-62m52"] Apr 22 18:55:52.790250 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:52.790224 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-62m52" Apr 22 18:55:52.792337 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:52.792316 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-mltsm\"" Apr 22 18:55:52.796142 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:52.796111 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-62m52"] Apr 22 18:55:52.799983 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:52.799957 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-fndf6" event={"ID":"29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06","Type":"ContainerStarted","Data":"013389a5f02fd6d91f383a96780afeb3ae2821d2c8c31fa3d9fcbfc5cac0fa2c"} Apr 22 18:55:52.886183 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:52.886151 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz5tk\" (UniqueName: \"kubernetes.io/projected/364e2c0a-c8fe-4584-8d63-950c2ccb8f8a-kube-api-access-bz5tk\") pod \"authorino-674b59b84c-62m52\" (UID: \"364e2c0a-c8fe-4584-8d63-950c2ccb8f8a\") " pod="kuadrant-system/authorino-674b59b84c-62m52" Apr 22 18:55:52.970011 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:52.969972 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-79cbc94b89-8jk44"] Apr 22 18:55:52.972349 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:52.972327 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-8jk44" Apr 22 18:55:52.977936 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:52.977905 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-8jk44"] Apr 22 18:55:52.986906 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:52.986869 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bz5tk\" (UniqueName: \"kubernetes.io/projected/364e2c0a-c8fe-4584-8d63-950c2ccb8f8a-kube-api-access-bz5tk\") pod \"authorino-674b59b84c-62m52\" (UID: \"364e2c0a-c8fe-4584-8d63-950c2ccb8f8a\") " pod="kuadrant-system/authorino-674b59b84c-62m52" Apr 22 18:55:52.994013 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:52.993992 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz5tk\" (UniqueName: \"kubernetes.io/projected/364e2c0a-c8fe-4584-8d63-950c2ccb8f8a-kube-api-access-bz5tk\") pod \"authorino-674b59b84c-62m52\" (UID: \"364e2c0a-c8fe-4584-8d63-950c2ccb8f8a\") " pod="kuadrant-system/authorino-674b59b84c-62m52" Apr 22 18:55:53.088349 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:53.088265 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bhng\" (UniqueName: \"kubernetes.io/projected/ef2880a6-ea08-4b2c-868d-ff622b7d1e6d-kube-api-access-4bhng\") pod \"authorino-79cbc94b89-8jk44\" (UID: \"ef2880a6-ea08-4b2c-868d-ff622b7d1e6d\") " pod="kuadrant-system/authorino-79cbc94b89-8jk44" Apr 22 18:55:53.102097 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:53.102062 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-62m52" Apr 22 18:55:53.189054 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:53.189018 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4bhng\" (UniqueName: \"kubernetes.io/projected/ef2880a6-ea08-4b2c-868d-ff622b7d1e6d-kube-api-access-4bhng\") pod \"authorino-79cbc94b89-8jk44\" (UID: \"ef2880a6-ea08-4b2c-868d-ff622b7d1e6d\") " pod="kuadrant-system/authorino-79cbc94b89-8jk44" Apr 22 18:55:53.197563 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:53.197530 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bhng\" (UniqueName: \"kubernetes.io/projected/ef2880a6-ea08-4b2c-868d-ff622b7d1e6d-kube-api-access-4bhng\") pod \"authorino-79cbc94b89-8jk44\" (UID: \"ef2880a6-ea08-4b2c-868d-ff622b7d1e6d\") " pod="kuadrant-system/authorino-79cbc94b89-8jk44" Apr 22 18:55:53.217616 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:53.217590 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-62m52"] Apr 22 18:55:53.220141 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:55:53.220108 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod364e2c0a_c8fe_4584_8d63_950c2ccb8f8a.slice/crio-f1261398275f3bf0f75f6313ab6eba30946b3af8597d01ca78879a644365c25d WatchSource:0}: Error finding container f1261398275f3bf0f75f6313ab6eba30946b3af8597d01ca78879a644365c25d: Status 404 returned error can't find the container with id f1261398275f3bf0f75f6313ab6eba30946b3af8597d01ca78879a644365c25d Apr 22 18:55:53.282044 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:53.282012 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-8jk44" Apr 22 18:55:53.403244 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:53.403215 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-8jk44"] Apr 22 18:55:53.405521 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:55:53.405468 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef2880a6_ea08_4b2c_868d_ff622b7d1e6d.slice/crio-889b9dfff65d09e8fe7e2e76ce02ec54167587afe4015e9f84d3810689769801 WatchSource:0}: Error finding container 889b9dfff65d09e8fe7e2e76ce02ec54167587afe4015e9f84d3810689769801: Status 404 returned error can't find the container with id 889b9dfff65d09e8fe7e2e76ce02ec54167587afe4015e9f84d3810689769801 Apr 22 18:55:53.804469 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:53.804432 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-8jk44" event={"ID":"ef2880a6-ea08-4b2c-868d-ff622b7d1e6d","Type":"ContainerStarted","Data":"889b9dfff65d09e8fe7e2e76ce02ec54167587afe4015e9f84d3810689769801"} Apr 22 18:55:53.805504 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:53.805458 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-62m52" event={"ID":"364e2c0a-c8fe-4584-8d63-950c2ccb8f8a","Type":"ContainerStarted","Data":"f1261398275f3bf0f75f6313ab6eba30946b3af8597d01ca78879a644365c25d"} Apr 22 18:55:58.828161 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:58.828123 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-fndf6" event={"ID":"29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06","Type":"ContainerStarted","Data":"4ca533b1ed531ff86fe9206034ee8633be8b37b6e641b7891a56387ffc82efc7"} Apr 22 18:55:58.828641 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:58.828231 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-fndf6" Apr 22 18:55:58.829413 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:58.829385 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-62m52" event={"ID":"364e2c0a-c8fe-4584-8d63-950c2ccb8f8a","Type":"ContainerStarted","Data":"80177bf68922a49d02880b809fc2c027f9c2b27e46d542b8ab35be15d743eea1"} Apr 22 18:55:58.830551 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:58.830529 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-8jk44" event={"ID":"ef2880a6-ea08-4b2c-868d-ff622b7d1e6d","Type":"ContainerStarted","Data":"5fc4fa2208b6895bb96355590073209bac7d6164a21551f0988394db8264230a"} Apr 22 18:55:58.842476 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:58.842412 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-64c8f475fb-fndf6" podStartSLOduration=2.214399004 podStartE2EDuration="7.842395592s" podCreationTimestamp="2026-04-22 18:55:51 +0000 UTC" firstStartedPulling="2026-04-22 18:55:52.450619661 +0000 UTC m=+538.899401973" lastFinishedPulling="2026-04-22 18:55:58.078616241 +0000 UTC m=+544.527398561" observedRunningTime="2026-04-22 18:55:58.841406443 +0000 UTC m=+545.290188794" watchObservedRunningTime="2026-04-22 18:55:58.842395592 +0000 UTC m=+545.291177923" Apr 22 18:55:58.854748 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:58.854698 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/authorino-674b59b84c-62m52" podStartSLOduration=2.007938801 podStartE2EDuration="6.854682284s" podCreationTimestamp="2026-04-22 18:55:52 +0000 UTC" firstStartedPulling="2026-04-22 18:55:53.221687519 +0000 UTC m=+539.670469826" lastFinishedPulling="2026-04-22 18:55:58.068431001 +0000 UTC m=+544.517213309" observedRunningTime="2026-04-22 18:55:58.853289933 +0000 UTC m=+545.302072265" watchObservedRunningTime="2026-04-22 18:55:58.854682284 +0000 UTC m=+545.303464635" Apr 22 18:55:58.865937 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:58.865878 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-79cbc94b89-8jk44" podStartSLOduration=2.204304991 podStartE2EDuration="6.865858559s" podCreationTimestamp="2026-04-22 18:55:52 +0000 UTC" firstStartedPulling="2026-04-22 18:55:53.407083185 +0000 UTC m=+539.855865498" lastFinishedPulling="2026-04-22 18:55:58.068636743 +0000 UTC m=+544.517419066" observedRunningTime="2026-04-22 18:55:58.865231148 +0000 UTC m=+545.314013490" watchObservedRunningTime="2026-04-22 18:55:58.865858559 +0000 UTC m=+545.314640891" Apr 22 18:55:58.889930 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:55:58.889896 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-62m52"] Apr 22 18:56:00.837249 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:00.837210 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-674b59b84c-62m52" podUID="364e2c0a-c8fe-4584-8d63-950c2ccb8f8a" containerName="authorino" containerID="cri-o://80177bf68922a49d02880b809fc2c027f9c2b27e46d542b8ab35be15d743eea1" gracePeriod=30 Apr 22 18:56:01.075175 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:01.075150 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-62m52" Apr 22 18:56:01.170150 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:01.170055 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz5tk\" (UniqueName: \"kubernetes.io/projected/364e2c0a-c8fe-4584-8d63-950c2ccb8f8a-kube-api-access-bz5tk\") pod \"364e2c0a-c8fe-4584-8d63-950c2ccb8f8a\" (UID: \"364e2c0a-c8fe-4584-8d63-950c2ccb8f8a\") " Apr 22 18:56:01.172229 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:01.172198 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/364e2c0a-c8fe-4584-8d63-950c2ccb8f8a-kube-api-access-bz5tk" (OuterVolumeSpecName: "kube-api-access-bz5tk") pod "364e2c0a-c8fe-4584-8d63-950c2ccb8f8a" (UID: "364e2c0a-c8fe-4584-8d63-950c2ccb8f8a"). InnerVolumeSpecName "kube-api-access-bz5tk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:56:01.271839 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:01.271788 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bz5tk\" (UniqueName: \"kubernetes.io/projected/364e2c0a-c8fe-4584-8d63-950c2ccb8f8a-kube-api-access-bz5tk\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 18:56:01.842031 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:01.841993 2579 generic.go:358] "Generic (PLEG): container finished" podID="364e2c0a-c8fe-4584-8d63-950c2ccb8f8a" containerID="80177bf68922a49d02880b809fc2c027f9c2b27e46d542b8ab35be15d743eea1" exitCode=0 Apr 22 18:56:01.842552 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:01.842042 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-62m52" event={"ID":"364e2c0a-c8fe-4584-8d63-950c2ccb8f8a","Type":"ContainerDied","Data":"80177bf68922a49d02880b809fc2c027f9c2b27e46d542b8ab35be15d743eea1"} Apr 22 18:56:01.842552 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:01.842061 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-62m52" Apr 22 18:56:01.842552 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:01.842090 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-62m52" event={"ID":"364e2c0a-c8fe-4584-8d63-950c2ccb8f8a","Type":"ContainerDied","Data":"f1261398275f3bf0f75f6313ab6eba30946b3af8597d01ca78879a644365c25d"} Apr 22 18:56:01.842552 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:01.842112 2579 scope.go:117] "RemoveContainer" containerID="80177bf68922a49d02880b809fc2c027f9c2b27e46d542b8ab35be15d743eea1" Apr 22 18:56:01.851336 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:01.851316 2579 scope.go:117] "RemoveContainer" containerID="80177bf68922a49d02880b809fc2c027f9c2b27e46d542b8ab35be15d743eea1" Apr 22 18:56:01.851662 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:56:01.851636 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80177bf68922a49d02880b809fc2c027f9c2b27e46d542b8ab35be15d743eea1\": container with ID starting with 80177bf68922a49d02880b809fc2c027f9c2b27e46d542b8ab35be15d743eea1 not found: ID does not exist" containerID="80177bf68922a49d02880b809fc2c027f9c2b27e46d542b8ab35be15d743eea1" Apr 22 18:56:01.851721 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:01.851673 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80177bf68922a49d02880b809fc2c027f9c2b27e46d542b8ab35be15d743eea1"} err="failed to get container status \"80177bf68922a49d02880b809fc2c027f9c2b27e46d542b8ab35be15d743eea1\": rpc error: code = NotFound desc = could not find container \"80177bf68922a49d02880b809fc2c027f9c2b27e46d542b8ab35be15d743eea1\": container with ID starting with 80177bf68922a49d02880b809fc2c027f9c2b27e46d542b8ab35be15d743eea1 not found: ID does not exist" Apr 22 18:56:01.862430 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:01.862393 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-62m52"] Apr 22 18:56:01.864139 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:01.864116 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-674b59b84c-62m52"] Apr 22 18:56:02.163735 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:02.163659 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="364e2c0a-c8fe-4584-8d63-950c2ccb8f8a" path="/var/lib/kubelet/pods/364e2c0a-c8fe-4584-8d63-950c2ccb8f8a/volumes" Apr 22 18:56:05.766044 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:05.766005 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-fndf6"] Apr 22 18:56:05.766546 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:05.766255 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-64c8f475fb-fndf6" podUID="29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06" containerName="limitador" containerID="cri-o://4ca533b1ed531ff86fe9206034ee8633be8b37b6e641b7891a56387ffc82efc7" gracePeriod=30 Apr 22 18:56:05.768517 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:05.768477 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-fndf6" Apr 22 18:56:06.305093 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:06.305066 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-fndf6" Apr 22 18:56:06.416000 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:06.415896 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06-config-file\") pod \"29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06\" (UID: \"29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06\") " Apr 22 18:56:06.416000 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:06.415964 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b644f\" (UniqueName: \"kubernetes.io/projected/29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06-kube-api-access-b644f\") pod \"29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06\" (UID: \"29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06\") " Apr 22 18:56:06.416322 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:06.416302 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06-config-file" (OuterVolumeSpecName: "config-file") pod "29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06" (UID: "29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:56:06.418103 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:06.418074 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06-kube-api-access-b644f" (OuterVolumeSpecName: "kube-api-access-b644f") pod "29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06" (UID: "29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06"). InnerVolumeSpecName "kube-api-access-b644f". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:56:06.517625 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:06.517592 2579 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06-config-file\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 18:56:06.517625 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:06.517623 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b644f\" (UniqueName: \"kubernetes.io/projected/29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06-kube-api-access-b644f\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 18:56:06.859800 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:06.859759 2579 generic.go:358] "Generic (PLEG): container finished" podID="29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06" containerID="4ca533b1ed531ff86fe9206034ee8633be8b37b6e641b7891a56387ffc82efc7" exitCode=0 Apr 22 18:56:06.860198 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:06.859853 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-fndf6" Apr 22 18:56:06.860198 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:06.859845 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-fndf6" event={"ID":"29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06","Type":"ContainerDied","Data":"4ca533b1ed531ff86fe9206034ee8633be8b37b6e641b7891a56387ffc82efc7"} Apr 22 18:56:06.860198 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:06.859951 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-fndf6" event={"ID":"29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06","Type":"ContainerDied","Data":"013389a5f02fd6d91f383a96780afeb3ae2821d2c8c31fa3d9fcbfc5cac0fa2c"} Apr 22 18:56:06.860198 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:06.859970 2579 scope.go:117] "RemoveContainer" containerID="4ca533b1ed531ff86fe9206034ee8633be8b37b6e641b7891a56387ffc82efc7" Apr 22 18:56:06.868223 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:06.868205 2579 scope.go:117] "RemoveContainer" containerID="4ca533b1ed531ff86fe9206034ee8633be8b37b6e641b7891a56387ffc82efc7" Apr 22 18:56:06.868462 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:56:06.868443 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ca533b1ed531ff86fe9206034ee8633be8b37b6e641b7891a56387ffc82efc7\": container with ID starting with 4ca533b1ed531ff86fe9206034ee8633be8b37b6e641b7891a56387ffc82efc7 not found: ID does not exist" containerID="4ca533b1ed531ff86fe9206034ee8633be8b37b6e641b7891a56387ffc82efc7" Apr 22 18:56:06.868546 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:06.868471 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ca533b1ed531ff86fe9206034ee8633be8b37b6e641b7891a56387ffc82efc7"} err="failed to get container status \"4ca533b1ed531ff86fe9206034ee8633be8b37b6e641b7891a56387ffc82efc7\": rpc error: code = NotFound desc = could not find container \"4ca533b1ed531ff86fe9206034ee8633be8b37b6e641b7891a56387ffc82efc7\": container with ID starting with 4ca533b1ed531ff86fe9206034ee8633be8b37b6e641b7891a56387ffc82efc7 not found: ID does not exist" Apr 22 18:56:06.878875 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:06.878838 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-fndf6"] 
Apr 22 18:56:06.881765 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:06.881738 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-fndf6"] Apr 22 18:56:08.164227 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:08.164195 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06" path="/var/lib/kubelet/pods/29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06/volumes" Apr 22 18:56:14.178192 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:14.178157 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-68bd676465-9txrc"] Apr 22 18:56:14.178597 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:14.178457 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06" containerName="limitador" Apr 22 18:56:14.178597 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:14.178467 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06" containerName="limitador" Apr 22 18:56:14.178597 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:14.178474 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="364e2c0a-c8fe-4584-8d63-950c2ccb8f8a" containerName="authorino" Apr 22 18:56:14.178597 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:14.178480 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="364e2c0a-c8fe-4584-8d63-950c2ccb8f8a" containerName="authorino" Apr 22 18:56:14.178597 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:14.178545 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="29b35b2d-d1a7-49c2-b6e1-b3eb2fec1c06" containerName="limitador" Apr 22 18:56:14.178597 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:14.178554 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="364e2c0a-c8fe-4584-8d63-950c2ccb8f8a" containerName="authorino" Apr 22 18:56:14.180584 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:14.180566 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68bd676465-9txrc" Apr 22 18:56:14.182515 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:14.182477 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 22 18:56:14.186351 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:14.186323 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-9txrc"] Apr 22 18:56:14.275755 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:14.275715 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdzrd\" (UniqueName: \"kubernetes.io/projected/8ab037c0-ba08-477b-9b37-f4d50d3e7104-kube-api-access-pdzrd\") pod \"authorino-68bd676465-9txrc\" (UID: \"8ab037c0-ba08-477b-9b37-f4d50d3e7104\") " pod="kuadrant-system/authorino-68bd676465-9txrc" Apr 22 18:56:14.275936 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:14.275776 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/8ab037c0-ba08-477b-9b37-f4d50d3e7104-tls-cert\") pod \"authorino-68bd676465-9txrc\" (UID: \"8ab037c0-ba08-477b-9b37-f4d50d3e7104\") " pod="kuadrant-system/authorino-68bd676465-9txrc" Apr 22 18:56:14.376266 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:14.376228 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pdzrd\" (UniqueName: \"kubernetes.io/projected/8ab037c0-ba08-477b-9b37-f4d50d3e7104-kube-api-access-pdzrd\") pod \"authorino-68bd676465-9txrc\" (UID: \"8ab037c0-ba08-477b-9b37-f4d50d3e7104\") " pod="kuadrant-system/authorino-68bd676465-9txrc" Apr 22 18:56:14.376422 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:14.376283 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/8ab037c0-ba08-477b-9b37-f4d50d3e7104-tls-cert\") pod \"authorino-68bd676465-9txrc\" (UID: \"8ab037c0-ba08-477b-9b37-f4d50d3e7104\") " pod="kuadrant-system/authorino-68bd676465-9txrc" Apr 22 18:56:14.378719 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:14.378700 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/8ab037c0-ba08-477b-9b37-f4d50d3e7104-tls-cert\") pod \"authorino-68bd676465-9txrc\" (UID: \"8ab037c0-ba08-477b-9b37-f4d50d3e7104\") " pod="kuadrant-system/authorino-68bd676465-9txrc" Apr 22 18:56:14.383188 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:14.383167 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdzrd\" (UniqueName: \"kubernetes.io/projected/8ab037c0-ba08-477b-9b37-f4d50d3e7104-kube-api-access-pdzrd\") pod \"authorino-68bd676465-9txrc\" (UID: \"8ab037c0-ba08-477b-9b37-f4d50d3e7104\") " pod="kuadrant-system/authorino-68bd676465-9txrc" Apr 22 18:56:14.489646 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:14.489614 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68bd676465-9txrc" Apr 22 18:56:14.607417 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:14.607386 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-9txrc"] Apr 22 18:56:14.609583 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:56:14.609554 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ab037c0_ba08_477b_9b37_f4d50d3e7104.slice/crio-37bcdd4b318eb813780e37cce29383806c1b8132a02dc4da6f62589be56edbae WatchSource:0}: Error finding container 37bcdd4b318eb813780e37cce29383806c1b8132a02dc4da6f62589be56edbae: Status 404 returned error can't find the container with id 37bcdd4b318eb813780e37cce29383806c1b8132a02dc4da6f62589be56edbae Apr 22 18:56:14.888602 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:14.888511 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-9txrc" event={"ID":"8ab037c0-ba08-477b-9b37-f4d50d3e7104","Type":"ContainerStarted","Data":"37bcdd4b318eb813780e37cce29383806c1b8132a02dc4da6f62589be56edbae"} Apr 22 18:56:15.892768 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:15.892731 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-9txrc" event={"ID":"8ab037c0-ba08-477b-9b37-f4d50d3e7104","Type":"ContainerStarted","Data":"62773a12bc8f39f6a8f954ffbf2e760b35e00530905fc0dca819be8a897a3158"} Apr 22 18:56:15.906330 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:15.906280 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-68bd676465-9txrc" podStartSLOduration=1.309536777 podStartE2EDuration="1.906264377s" podCreationTimestamp="2026-04-22 18:56:14 +0000 UTC" firstStartedPulling="2026-04-22 18:56:14.611346506 +0000 UTC m=+561.060128814" lastFinishedPulling="2026-04-22 18:56:15.208074103 +0000 UTC m=+561.656856414" observedRunningTime="2026-04-22 18:56:15.904927091 +0000 UTC m=+562.353709423" watchObservedRunningTime="2026-04-22 18:56:15.906264377 +0000 UTC m=+562.355046707" Apr 22 18:56:15.930042 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:15.930006 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-8jk44"] Apr 22 18:56:15.930236 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:15.930215 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-79cbc94b89-8jk44" podUID="ef2880a6-ea08-4b2c-868d-ff622b7d1e6d" containerName="authorino" containerID="cri-o://5fc4fa2208b6895bb96355590073209bac7d6164a21551f0988394db8264230a" gracePeriod=30 Apr 22 18:56:16.180098 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:16.180075 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-8jk44" Apr 22 18:56:16.191968 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:16.191940 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bhng\" (UniqueName: \"kubernetes.io/projected/ef2880a6-ea08-4b2c-868d-ff622b7d1e6d-kube-api-access-4bhng\") pod \"ef2880a6-ea08-4b2c-868d-ff622b7d1e6d\" (UID: \"ef2880a6-ea08-4b2c-868d-ff622b7d1e6d\") " Apr 22 18:56:16.194518 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:16.194464 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef2880a6-ea08-4b2c-868d-ff622b7d1e6d-kube-api-access-4bhng" (OuterVolumeSpecName: "kube-api-access-4bhng") pod "ef2880a6-ea08-4b2c-868d-ff622b7d1e6d" (UID: "ef2880a6-ea08-4b2c-868d-ff622b7d1e6d"). InnerVolumeSpecName "kube-api-access-4bhng". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:56:16.292867 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:16.292811 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4bhng\" (UniqueName: \"kubernetes.io/projected/ef2880a6-ea08-4b2c-868d-ff622b7d1e6d-kube-api-access-4bhng\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 18:56:16.897104 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:16.897069 2579 generic.go:358] "Generic (PLEG): container finished" podID="ef2880a6-ea08-4b2c-868d-ff622b7d1e6d" containerID="5fc4fa2208b6895bb96355590073209bac7d6164a21551f0988394db8264230a" exitCode=0 Apr 22 18:56:16.897569 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:16.897119 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-8jk44" Apr 22 18:56:16.897569 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:16.897155 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-8jk44" event={"ID":"ef2880a6-ea08-4b2c-868d-ff622b7d1e6d","Type":"ContainerDied","Data":"5fc4fa2208b6895bb96355590073209bac7d6164a21551f0988394db8264230a"} Apr 22 18:56:16.897569 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:16.897188 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-8jk44" event={"ID":"ef2880a6-ea08-4b2c-868d-ff622b7d1e6d","Type":"ContainerDied","Data":"889b9dfff65d09e8fe7e2e76ce02ec54167587afe4015e9f84d3810689769801"} Apr 22 18:56:16.897569 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:16.897205 2579 scope.go:117] "RemoveContainer" containerID="5fc4fa2208b6895bb96355590073209bac7d6164a21551f0988394db8264230a" Apr 22 18:56:16.905251 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:16.905234 2579 scope.go:117] "RemoveContainer" containerID="5fc4fa2208b6895bb96355590073209bac7d6164a21551f0988394db8264230a" Apr 22 18:56:16.905517 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:56:16.905476 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fc4fa2208b6895bb96355590073209bac7d6164a21551f0988394db8264230a\": container with ID starting with 5fc4fa2208b6895bb96355590073209bac7d6164a21551f0988394db8264230a not found: ID does not exist" containerID="5fc4fa2208b6895bb96355590073209bac7d6164a21551f0988394db8264230a" Apr 22 18:56:16.905575 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:16.905526 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fc4fa2208b6895bb96355590073209bac7d6164a21551f0988394db8264230a"} 
err="failed to get container status \"5fc4fa2208b6895bb96355590073209bac7d6164a21551f0988394db8264230a\": rpc error: code = NotFound desc = could not find container \"5fc4fa2208b6895bb96355590073209bac7d6164a21551f0988394db8264230a\": container with ID starting with 5fc4fa2208b6895bb96355590073209bac7d6164a21551f0988394db8264230a not found: ID does not exist" Apr 22 18:56:16.915167 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:16.915137 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-8jk44"] Apr 22 18:56:16.918317 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:16.918295 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-8jk44"] Apr 22 18:56:18.163711 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:18.163677 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef2880a6-ea08-4b2c-868d-ff622b7d1e6d" path="/var/lib/kubelet/pods/ef2880a6-ea08-4b2c-868d-ff622b7d1e6d/volumes" Apr 22 18:56:31.976160 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:31.976121 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-8qdcw"] Apr 22 18:56:31.976577 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:31.976480 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef2880a6-ea08-4b2c-868d-ff622b7d1e6d" containerName="authorino" Apr 22 18:56:31.976577 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:31.976510 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef2880a6-ea08-4b2c-868d-ff622b7d1e6d" containerName="authorino" Apr 22 18:56:31.976577 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:31.976570 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="ef2880a6-ea08-4b2c-868d-ff622b7d1e6d" containerName="authorino" Apr 22 18:56:31.997776 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:31.997740 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-8qdcw"] Apr 22 18:56:31.997950 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:31.997887 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-8qdcw" Apr 22 18:56:32.000078 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:32.000050 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 22 18:56:32.000223 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:32.000095 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-zgg5s\"" Apr 22 18:56:32.000307 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:32.000239 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 18:56:32.000848 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:32.000825 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 18:56:32.029435 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:32.029406 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-mwktm"] Apr 22 18:56:32.032453 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:32.032429 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-mwktm" Apr 22 18:56:32.035345 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:32.035319 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 22 18:56:32.035497 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:32.035367 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-xnqnt\"" Apr 22 18:56:32.046266 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:32.046225 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-mwktm"] Apr 22 18:56:32.133174 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:32.133132 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdl74\" (UniqueName: \"kubernetes.io/projected/f7cb84e7-925f-4bc7-91bf-925046078dcc-kube-api-access-pdl74\") pod \"seaweedfs-86cc847c5c-mwktm\" (UID: \"f7cb84e7-925f-4bc7-91bf-925046078dcc\") " pod="kserve/seaweedfs-86cc847c5c-mwktm" Apr 22 18:56:32.133376 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:32.133211 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f7cb84e7-925f-4bc7-91bf-925046078dcc-data\") pod \"seaweedfs-86cc847c5c-mwktm\" (UID: \"f7cb84e7-925f-4bc7-91bf-925046078dcc\") " pod="kserve/seaweedfs-86cc847c5c-mwktm" Apr 22 18:56:32.133376 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:32.133246 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/692c017d-e670-4aba-b208-a7e20fddb740-cert\") pod \"kserve-controller-manager-6f655776dd-8qdcw\" (UID: \"692c017d-e670-4aba-b208-a7e20fddb740\") " pod="kserve/kserve-controller-manager-6f655776dd-8qdcw" Apr 22 18:56:32.133376 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:32.133273 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k29ms\" (UniqueName: \"kubernetes.io/projected/692c017d-e670-4aba-b208-a7e20fddb740-kube-api-access-k29ms\") pod \"kserve-controller-manager-6f655776dd-8qdcw\" (UID: \"692c017d-e670-4aba-b208-a7e20fddb740\") " pod="kserve/kserve-controller-manager-6f655776dd-8qdcw" Apr 22 18:56:32.234021 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:32.233983 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f7cb84e7-925f-4bc7-91bf-925046078dcc-data\") pod \"seaweedfs-86cc847c5c-mwktm\" (UID: \"f7cb84e7-925f-4bc7-91bf-925046078dcc\") " pod="kserve/seaweedfs-86cc847c5c-mwktm" Apr 22 18:56:32.234205 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:32.234033 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/692c017d-e670-4aba-b208-a7e20fddb740-cert\") pod \"kserve-controller-manager-6f655776dd-8qdcw\" (UID: \"692c017d-e670-4aba-b208-a7e20fddb740\") " pod="kserve/kserve-controller-manager-6f655776dd-8qdcw" Apr 22 18:56:32.234205 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:32.234059 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k29ms\" (UniqueName: \"kubernetes.io/projected/692c017d-e670-4aba-b208-a7e20fddb740-kube-api-access-k29ms\") pod \"kserve-controller-manager-6f655776dd-8qdcw\" (UID: \"692c017d-e670-4aba-b208-a7e20fddb740\") " 
pod="kserve/kserve-controller-manager-6f655776dd-8qdcw" Apr 22 18:56:32.234205 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:32.234116 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pdl74\" (UniqueName: \"kubernetes.io/projected/f7cb84e7-925f-4bc7-91bf-925046078dcc-kube-api-access-pdl74\") pod \"seaweedfs-86cc847c5c-mwktm\" (UID: \"f7cb84e7-925f-4bc7-91bf-925046078dcc\") " pod="kserve/seaweedfs-86cc847c5c-mwktm" Apr 22 18:56:32.234370 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:32.234347 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f7cb84e7-925f-4bc7-91bf-925046078dcc-data\") pod \"seaweedfs-86cc847c5c-mwktm\" (UID: \"f7cb84e7-925f-4bc7-91bf-925046078dcc\") " pod="kserve/seaweedfs-86cc847c5c-mwktm" Apr 22 18:56:32.236572 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:32.236551 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/692c017d-e670-4aba-b208-a7e20fddb740-cert\") pod \"kserve-controller-manager-6f655776dd-8qdcw\" (UID: \"692c017d-e670-4aba-b208-a7e20fddb740\") " pod="kserve/kserve-controller-manager-6f655776dd-8qdcw" Apr 22 18:56:32.244963 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:32.244939 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k29ms\" (UniqueName: \"kubernetes.io/projected/692c017d-e670-4aba-b208-a7e20fddb740-kube-api-access-k29ms\") pod \"kserve-controller-manager-6f655776dd-8qdcw\" (UID: \"692c017d-e670-4aba-b208-a7e20fddb740\") " pod="kserve/kserve-controller-manager-6f655776dd-8qdcw" Apr 22 18:56:32.256139 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:32.256110 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdl74\" (UniqueName: \"kubernetes.io/projected/f7cb84e7-925f-4bc7-91bf-925046078dcc-kube-api-access-pdl74\") pod \"seaweedfs-86cc847c5c-mwktm\" (UID: \"f7cb84e7-925f-4bc7-91bf-925046078dcc\") " pod="kserve/seaweedfs-86cc847c5c-mwktm" Apr 22 18:56:32.309787 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:32.309754 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-8qdcw" Apr 22 18:56:32.348299 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:32.348258 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-mwktm" Apr 22 18:56:32.483366 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:32.483339 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-mwktm"] Apr 22 18:56:32.485381 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:56:32.485310 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7cb84e7_925f_4bc7_91bf_925046078dcc.slice/crio-bc2e5fbc061ef2e90dca15af34aff8ace39021290e806c270b803681be748796 WatchSource:0}: Error finding container bc2e5fbc061ef2e90dca15af34aff8ace39021290e806c270b803681be748796: Status 404 returned error can't find the container with id bc2e5fbc061ef2e90dca15af34aff8ace39021290e806c270b803681be748796 Apr 22 18:56:32.661137 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:32.661112 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-8qdcw"] Apr 22 18:56:32.663216 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:56:32.663186 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod692c017d_e670_4aba_b208_a7e20fddb740.slice/crio-2552ecb4b2b7711ed56d7d47a2b19021eb5ce7798c6a174447733ad3a6a10e21 WatchSource:0}: Error finding container 2552ecb4b2b7711ed56d7d47a2b19021eb5ce7798c6a174447733ad3a6a10e21: Status 404 returned error can't find the container with id 2552ecb4b2b7711ed56d7d47a2b19021eb5ce7798c6a174447733ad3a6a10e21 Apr 22 18:56:32.958678 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:32.958590 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-8qdcw" event={"ID":"692c017d-e670-4aba-b208-a7e20fddb740","Type":"ContainerStarted","Data":"2552ecb4b2b7711ed56d7d47a2b19021eb5ce7798c6a174447733ad3a6a10e21"} Apr 22 18:56:32.959851 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:32.959819 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-mwktm" event={"ID":"f7cb84e7-925f-4bc7-91bf-925046078dcc","Type":"ContainerStarted","Data":"bc2e5fbc061ef2e90dca15af34aff8ace39021290e806c270b803681be748796"} Apr 22 18:56:36.974391 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:36.974354 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-mwktm" event={"ID":"f7cb84e7-925f-4bc7-91bf-925046078dcc","Type":"ContainerStarted","Data":"14583ab653964d8fb0052565340e3c2dd2660310bc22f44867d048f6b703c121"} Apr 22 18:56:36.974876 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:36.974454 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-mwktm" Apr 22 18:56:36.975698 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:36.975676 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-8qdcw" event={"ID":"692c017d-e670-4aba-b208-a7e20fddb740","Type":"ContainerStarted","Data":"bfbacb039a315220a69769adbce3044a484b225712501933a9ea5d04ae7efe9f"} Apr 22 18:56:36.975828 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:36.975820 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-6f655776dd-8qdcw" Apr 22 18:56:36.988693 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:36.988637 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-mwktm" podStartSLOduration=1.153051572 
podStartE2EDuration="4.988621985s" podCreationTimestamp="2026-04-22 18:56:32 +0000 UTC" firstStartedPulling="2026-04-22 18:56:32.486596943 +0000 UTC m=+578.935379251" lastFinishedPulling="2026-04-22 18:56:36.322167341 +0000 UTC m=+582.770949664" observedRunningTime="2026-04-22 18:56:36.988567272 +0000 UTC m=+583.437349601" watchObservedRunningTime="2026-04-22 18:56:36.988621985 +0000 UTC m=+583.437404316" Apr 22 18:56:37.003077 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:37.003025 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-6f655776dd-8qdcw" podStartSLOduration=2.443249785 podStartE2EDuration="6.003010813s" podCreationTimestamp="2026-04-22 18:56:31 +0000 UTC" firstStartedPulling="2026-04-22 18:56:32.664581227 +0000 UTC m=+579.113363535" lastFinishedPulling="2026-04-22 18:56:36.224342254 +0000 UTC m=+582.673124563" observedRunningTime="2026-04-22 18:56:37.001579553 +0000 UTC m=+583.450361880" watchObservedRunningTime="2026-04-22 18:56:37.003010813 +0000 UTC m=+583.451793142" Apr 22 18:56:42.981056 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:42.981024 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-mwktm" Apr 22 18:56:54.076903 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:54.076869 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tcrqv_e741af54-1487-45cb-b5ea-22577783221e/ovn-acl-logging/0.log" Apr 22 18:56:54.077340 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:56:54.076946 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tcrqv_e741af54-1487-45cb-b5ea-22577783221e/ovn-acl-logging/0.log" Apr 22 18:57:07.984317 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:07.984284 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-6f655776dd-8qdcw" Apr 22 18:57:09.439338 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:09.439292 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-8qdcw"] Apr 22 18:57:09.439855 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:09.439612 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-6f655776dd-8qdcw" podUID="692c017d-e670-4aba-b208-a7e20fddb740" containerName="manager" containerID="cri-o://bfbacb039a315220a69769adbce3044a484b225712501933a9ea5d04ae7efe9f" gracePeriod=10 Apr 22 18:57:09.457175 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:09.457141 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-8k9k2"] Apr 22 18:57:09.498969 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:09.498948 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-8k9k2"] Apr 22 18:57:09.499073 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:09.499063 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-8k9k2" Apr 22 18:57:09.570519 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:09.570296 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4bzz\" (UniqueName: \"kubernetes.io/projected/d7d33c80-4ec6-4055-8a15-13659877acea-kube-api-access-x4bzz\") pod \"kserve-controller-manager-6f655776dd-8k9k2\" (UID: \"d7d33c80-4ec6-4055-8a15-13659877acea\") " pod="kserve/kserve-controller-manager-6f655776dd-8k9k2" Apr 22 18:57:09.570519 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:09.570349 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7d33c80-4ec6-4055-8a15-13659877acea-cert\") pod \"kserve-controller-manager-6f655776dd-8k9k2\" (UID: \"d7d33c80-4ec6-4055-8a15-13659877acea\") " pod="kserve/kserve-controller-manager-6f655776dd-8k9k2" Apr 22 18:57:09.671002 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:09.670967 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x4bzz\" (UniqueName: \"kubernetes.io/projected/d7d33c80-4ec6-4055-8a15-13659877acea-kube-api-access-x4bzz\") pod \"kserve-controller-manager-6f655776dd-8k9k2\" (UID: \"d7d33c80-4ec6-4055-8a15-13659877acea\") " pod="kserve/kserve-controller-manager-6f655776dd-8k9k2" Apr 22 18:57:09.671194 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:09.671010 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7d33c80-4ec6-4055-8a15-13659877acea-cert\") pod \"kserve-controller-manager-6f655776dd-8k9k2\" (UID: \"d7d33c80-4ec6-4055-8a15-13659877acea\") " pod="kserve/kserve-controller-manager-6f655776dd-8k9k2" Apr 22 18:57:09.673498 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:09.673457 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7d33c80-4ec6-4055-8a15-13659877acea-cert\") pod \"kserve-controller-manager-6f655776dd-8k9k2\" (UID: \"d7d33c80-4ec6-4055-8a15-13659877acea\") " pod="kserve/kserve-controller-manager-6f655776dd-8k9k2" Apr 22 18:57:09.678454 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:09.678427 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4bzz\" (UniqueName: \"kubernetes.io/projected/d7d33c80-4ec6-4055-8a15-13659877acea-kube-api-access-x4bzz\") pod \"kserve-controller-manager-6f655776dd-8k9k2\" (UID: \"d7d33c80-4ec6-4055-8a15-13659877acea\") " pod="kserve/kserve-controller-manager-6f655776dd-8k9k2" Apr 22 18:57:09.699584 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:09.699532 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-8qdcw" Apr 22 18:57:09.772130 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:09.772093 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/692c017d-e670-4aba-b208-a7e20fddb740-cert\") pod \"692c017d-e670-4aba-b208-a7e20fddb740\" (UID: \"692c017d-e670-4aba-b208-a7e20fddb740\") " Apr 22 18:57:09.772288 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:09.772145 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k29ms\" (UniqueName: \"kubernetes.io/projected/692c017d-e670-4aba-b208-a7e20fddb740-kube-api-access-k29ms\") pod \"692c017d-e670-4aba-b208-a7e20fddb740\" (UID: \"692c017d-e670-4aba-b208-a7e20fddb740\") " Apr 22 18:57:09.774274 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:09.774239 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/692c017d-e670-4aba-b208-a7e20fddb740-cert" (OuterVolumeSpecName: "cert") pod "692c017d-e670-4aba-b208-a7e20fddb740" (UID: "692c017d-e670-4aba-b208-a7e20fddb740"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:57:09.774383 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:09.774314 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/692c017d-e670-4aba-b208-a7e20fddb740-kube-api-access-k29ms" (OuterVolumeSpecName: "kube-api-access-k29ms") pod "692c017d-e670-4aba-b208-a7e20fddb740" (UID: "692c017d-e670-4aba-b208-a7e20fddb740"). InnerVolumeSpecName "kube-api-access-k29ms". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:57:09.848968 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:09.848936 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-8k9k2" Apr 22 18:57:09.872980 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:09.872947 2579 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/692c017d-e670-4aba-b208-a7e20fddb740-cert\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 18:57:09.872980 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:09.872979 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k29ms\" (UniqueName: \"kubernetes.io/projected/692c017d-e670-4aba-b208-a7e20fddb740-kube-api-access-k29ms\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 18:57:09.968021 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:09.967997 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-8k9k2"] Apr 22 18:57:09.969904 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:57:09.969879 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7d33c80_4ec6_4055_8a15_13659877acea.slice/crio-c72e6e526dc8a3b01715c57b51054d01af647906bc42d899a944c39717d8902f WatchSource:0}: Error finding container c72e6e526dc8a3b01715c57b51054d01af647906bc42d899a944c39717d8902f: Status 404 returned error can't find the container with id c72e6e526dc8a3b01715c57b51054d01af647906bc42d899a944c39717d8902f Apr 22 18:57:10.085301 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:10.085263 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-8k9k2" event={"ID":"d7d33c80-4ec6-4055-8a15-13659877acea","Type":"ContainerStarted","Data":"c72e6e526dc8a3b01715c57b51054d01af647906bc42d899a944c39717d8902f"} Apr 22 18:57:10.086179 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:10.086154 2579 generic.go:358] "Generic (PLEG): container finished" podID="692c017d-e670-4aba-b208-a7e20fddb740" containerID="bfbacb039a315220a69769adbce3044a484b225712501933a9ea5d04ae7efe9f" exitCode=0 Apr 22 18:57:10.086263 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:10.086218 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-8qdcw" Apr 22 18:57:10.086300 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:10.086211 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-8qdcw" event={"ID":"692c017d-e670-4aba-b208-a7e20fddb740","Type":"ContainerDied","Data":"bfbacb039a315220a69769adbce3044a484b225712501933a9ea5d04ae7efe9f"} Apr 22 18:57:10.086336 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:10.086319 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-8qdcw" event={"ID":"692c017d-e670-4aba-b208-a7e20fddb740","Type":"ContainerDied","Data":"2552ecb4b2b7711ed56d7d47a2b19021eb5ce7798c6a174447733ad3a6a10e21"} Apr 22 18:57:10.086368 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:10.086335 2579 scope.go:117] "RemoveContainer" containerID="bfbacb039a315220a69769adbce3044a484b225712501933a9ea5d04ae7efe9f" Apr 22 18:57:10.094423 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:10.094404 2579 scope.go:117] "RemoveContainer" containerID="bfbacb039a315220a69769adbce3044a484b225712501933a9ea5d04ae7efe9f" Apr 22 18:57:10.094725 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:57:10.094705 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfbacb039a315220a69769adbce3044a484b225712501933a9ea5d04ae7efe9f\": container with ID starting with bfbacb039a315220a69769adbce3044a484b225712501933a9ea5d04ae7efe9f not found: ID does not exist" containerID="bfbacb039a315220a69769adbce3044a484b225712501933a9ea5d04ae7efe9f" Apr 22 18:57:10.094777 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:10.094735 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfbacb039a315220a69769adbce3044a484b225712501933a9ea5d04ae7efe9f"} err="failed to get container status \"bfbacb039a315220a69769adbce3044a484b225712501933a9ea5d04ae7efe9f\": rpc error: code = NotFound desc = could not find container \"bfbacb039a315220a69769adbce3044a484b225712501933a9ea5d04ae7efe9f\": container with ID starting with bfbacb039a315220a69769adbce3044a484b225712501933a9ea5d04ae7efe9f not found: ID does not exist" Apr 22 18:57:10.105410 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:10.105382 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-8qdcw"] Apr 22 18:57:10.109371 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:10.109348 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-8qdcw"] Apr 22 18:57:10.164107 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:10.164074 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="692c017d-e670-4aba-b208-a7e20fddb740" path="/var/lib/kubelet/pods/692c017d-e670-4aba-b208-a7e20fddb740/volumes" Apr 22 18:57:11.090682 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:11.090640 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-8k9k2" event={"ID":"d7d33c80-4ec6-4055-8a15-13659877acea","Type":"ContainerStarted","Data":"af95ebb217e6f6264cab318333af5e2b1a029931454425ea265c2880f8c644ee"} Apr 22 18:57:11.091181 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:11.090726 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-6f655776dd-8k9k2" Apr 22 18:57:11.104764 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:11.104704 2579 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-6f655776dd-8k9k2" podStartSLOduration=1.559967784 podStartE2EDuration="2.104685057s" podCreationTimestamp="2026-04-22 18:57:09 +0000 UTC" firstStartedPulling="2026-04-22 18:57:09.971176828 +0000 UTC m=+616.419959136" lastFinishedPulling="2026-04-22 18:57:10.515894097 +0000 UTC m=+616.964676409" observedRunningTime="2026-04-22 18:57:11.103727044 +0000 UTC m=+617.552509390" watchObservedRunningTime="2026-04-22 18:57:11.104685057 +0000 UTC m=+617.553467388" Apr 22 18:57:42.099699 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:42.099668 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-6f655776dd-8k9k2" Apr 22 18:57:43.047506 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:43.047462 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-7dnmq"] Apr 22 18:57:43.047909 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:43.047890 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="692c017d-e670-4aba-b208-a7e20fddb740" containerName="manager" Apr 22 18:57:43.047999 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:43.047912 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="692c017d-e670-4aba-b208-a7e20fddb740" containerName="manager" Apr 22 18:57:43.047999 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:43.047990 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="692c017d-e670-4aba-b208-a7e20fddb740" containerName="manager" Apr 22 18:57:43.050862 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:43.050835 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-7dnmq" Apr 22 18:57:43.052874 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:43.052848 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 22 18:57:43.053090 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:43.053078 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-2nk5c\"" Apr 22 18:57:43.059808 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:43.059782 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-7dnmq"] Apr 22 18:57:43.162095 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:43.162066 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5473aab8-dcb2-4ca5-975f-b326a6d109e0-cert\") pod \"odh-model-controller-696fc77849-7dnmq\" (UID: \"5473aab8-dcb2-4ca5-975f-b326a6d109e0\") " pod="kserve/odh-model-controller-696fc77849-7dnmq" Apr 22 18:57:43.162095 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:43.162097 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v4mq\" (UniqueName: \"kubernetes.io/projected/5473aab8-dcb2-4ca5-975f-b326a6d109e0-kube-api-access-7v4mq\") pod \"odh-model-controller-696fc77849-7dnmq\" (UID: \"5473aab8-dcb2-4ca5-975f-b326a6d109e0\") " pod="kserve/odh-model-controller-696fc77849-7dnmq" Apr 22 18:57:43.263668 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:43.263627 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5473aab8-dcb2-4ca5-975f-b326a6d109e0-cert\") pod 
\"odh-model-controller-696fc77849-7dnmq\" (UID: \"5473aab8-dcb2-4ca5-975f-b326a6d109e0\") " pod="kserve/odh-model-controller-696fc77849-7dnmq" Apr 22 18:57:43.263668 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:43.263672 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7v4mq\" (UniqueName: \"kubernetes.io/projected/5473aab8-dcb2-4ca5-975f-b326a6d109e0-kube-api-access-7v4mq\") pod \"odh-model-controller-696fc77849-7dnmq\" (UID: \"5473aab8-dcb2-4ca5-975f-b326a6d109e0\") " pod="kserve/odh-model-controller-696fc77849-7dnmq" Apr 22 18:57:43.263939 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:57:43.263780 2579 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 22 18:57:43.263939 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:57:43.263844 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5473aab8-dcb2-4ca5-975f-b326a6d109e0-cert podName:5473aab8-dcb2-4ca5-975f-b326a6d109e0 nodeName:}" failed. No retries permitted until 2026-04-22 18:57:43.763827036 +0000 UTC m=+650.212609343 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5473aab8-dcb2-4ca5-975f-b326a6d109e0-cert") pod "odh-model-controller-696fc77849-7dnmq" (UID: "5473aab8-dcb2-4ca5-975f-b326a6d109e0") : secret "odh-model-controller-webhook-cert" not found Apr 22 18:57:43.274936 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:43.274904 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v4mq\" (UniqueName: \"kubernetes.io/projected/5473aab8-dcb2-4ca5-975f-b326a6d109e0-kube-api-access-7v4mq\") pod \"odh-model-controller-696fc77849-7dnmq\" (UID: \"5473aab8-dcb2-4ca5-975f-b326a6d109e0\") " pod="kserve/odh-model-controller-696fc77849-7dnmq" Apr 22 18:57:43.768856 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:43.768819 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5473aab8-dcb2-4ca5-975f-b326a6d109e0-cert\") pod \"odh-model-controller-696fc77849-7dnmq\" (UID: \"5473aab8-dcb2-4ca5-975f-b326a6d109e0\") " pod="kserve/odh-model-controller-696fc77849-7dnmq" Apr 22 18:57:43.771211 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:43.771181 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5473aab8-dcb2-4ca5-975f-b326a6d109e0-cert\") pod \"odh-model-controller-696fc77849-7dnmq\" (UID: \"5473aab8-dcb2-4ca5-975f-b326a6d109e0\") " pod="kserve/odh-model-controller-696fc77849-7dnmq" Apr 22 18:57:43.962792 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:43.962737 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-7dnmq" Apr 22 18:57:44.105121 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:44.105096 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-7dnmq"] Apr 22 18:57:44.107207 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:57:44.107177 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5473aab8_dcb2_4ca5_975f_b326a6d109e0.slice/crio-e46dcaea1a051260da3573d129b4c611078453bbd6a7fdce7e2e66c3a4d323a8 WatchSource:0}: Error finding container e46dcaea1a051260da3573d129b4c611078453bbd6a7fdce7e2e66c3a4d323a8: Status 404 returned error can't find the container with id e46dcaea1a051260da3573d129b4c611078453bbd6a7fdce7e2e66c3a4d323a8 Apr 22 18:57:44.194701 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:44.194670 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-7dnmq" event={"ID":"5473aab8-dcb2-4ca5-975f-b326a6d109e0","Type":"ContainerStarted","Data":"e46dcaea1a051260da3573d129b4c611078453bbd6a7fdce7e2e66c3a4d323a8"} Apr 22 18:57:47.206148 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:47.206113 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-7dnmq" event={"ID":"5473aab8-dcb2-4ca5-975f-b326a6d109e0","Type":"ContainerStarted","Data":"97bc0e2ed452cea8f24d520699772ec2b3cda911dd5b156dd2940e9304a89637"} Apr 22 18:57:47.206701 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:47.206275 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-7dnmq" Apr 22 18:57:47.220961 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:47.220917 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-7dnmq" podStartSLOduration=1.80092107 podStartE2EDuration="4.220904472s" podCreationTimestamp="2026-04-22 18:57:43 +0000 UTC" firstStartedPulling="2026-04-22 18:57:44.10845325 +0000 UTC m=+650.557235558" lastFinishedPulling="2026-04-22 18:57:46.528436652 +0000 UTC m=+652.977218960" observedRunningTime="2026-04-22 18:57:47.219689748 +0000 UTC m=+653.668472081" watchObservedRunningTime="2026-04-22 18:57:47.220904472 +0000 UTC m=+653.669686832" Apr 22 18:57:58.212521 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:58.212473 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-7dnmq" Apr 22 18:57:59.020422 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:59.020385 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-cxc57"] Apr 22 18:57:59.024900 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:59.024882 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-cxc57" Apr 22 18:57:59.029450 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:59.029423 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-cxc57"] Apr 22 18:57:59.102711 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:59.102678 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv5nz\" (UniqueName: \"kubernetes.io/projected/91a0099e-2e5f-4851-8224-cb91f9501782-kube-api-access-rv5nz\") pod \"s3-init-cxc57\" (UID: \"91a0099e-2e5f-4851-8224-cb91f9501782\") " pod="kserve/s3-init-cxc57" Apr 22 18:57:59.203130 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:59.203086 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rv5nz\" (UniqueName: \"kubernetes.io/projected/91a0099e-2e5f-4851-8224-cb91f9501782-kube-api-access-rv5nz\") pod \"s3-init-cxc57\" (UID: \"91a0099e-2e5f-4851-8224-cb91f9501782\") " pod="kserve/s3-init-cxc57" Apr 22 18:57:59.210481 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:59.210444 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv5nz\" (UniqueName: \"kubernetes.io/projected/91a0099e-2e5f-4851-8224-cb91f9501782-kube-api-access-rv5nz\") pod \"s3-init-cxc57\" (UID: \"91a0099e-2e5f-4851-8224-cb91f9501782\") " pod="kserve/s3-init-cxc57" Apr 22 18:57:59.334643 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:59.334555 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-cxc57" Apr 22 18:57:59.452837 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:57:59.452801 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-cxc57"] Apr 22 18:57:59.455650 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:57:59.455618 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91a0099e_2e5f_4851_8224_cb91f9501782.slice/crio-47ae001d26530689ebc4a16180a8634001728f1beb17846c868de15912eb1ae7 WatchSource:0}: Error finding container 47ae001d26530689ebc4a16180a8634001728f1beb17846c868de15912eb1ae7: Status 404 returned error can't find the container with id 47ae001d26530689ebc4a16180a8634001728f1beb17846c868de15912eb1ae7 Apr 22 18:58:00.250741 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:00.250681 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-cxc57" event={"ID":"91a0099e-2e5f-4851-8224-cb91f9501782","Type":"ContainerStarted","Data":"47ae001d26530689ebc4a16180a8634001728f1beb17846c868de15912eb1ae7"} Apr 22 18:58:04.266724 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:04.266686 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-cxc57" event={"ID":"91a0099e-2e5f-4851-8224-cb91f9501782","Type":"ContainerStarted","Data":"bd9a8fa97f5bfbadb6f7821424e72e10902111ca7e3d7f44777265caddb392a3"} Apr 22 18:58:04.280710 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:04.280661 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-cxc57" podStartSLOduration=0.847245291 podStartE2EDuration="5.280647388s" podCreationTimestamp="2026-04-22 18:57:59 +0000 UTC" firstStartedPulling="2026-04-22 18:57:59.457412119 +0000 UTC m=+665.906194427" lastFinishedPulling="2026-04-22 18:58:03.890814213 +0000 UTC m=+670.339596524" observedRunningTime="2026-04-22 18:58:04.278410578 +0000 UTC m=+670.727192910" watchObservedRunningTime="2026-04-22 18:58:04.280647388 +0000 UTC m=+670.729429718" Apr 
22 18:58:07.279594 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:07.279507 2579 generic.go:358] "Generic (PLEG): container finished" podID="91a0099e-2e5f-4851-8224-cb91f9501782" containerID="bd9a8fa97f5bfbadb6f7821424e72e10902111ca7e3d7f44777265caddb392a3" exitCode=0
Apr 22 18:58:07.279943 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:07.279582 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-cxc57" event={"ID":"91a0099e-2e5f-4851-8224-cb91f9501782","Type":"ContainerDied","Data":"bd9a8fa97f5bfbadb6f7821424e72e10902111ca7e3d7f44777265caddb392a3"}
Apr 22 18:58:08.416788 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:08.416763 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-cxc57"
Apr 22 18:58:08.487037 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:08.487004 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rv5nz\" (UniqueName: \"kubernetes.io/projected/91a0099e-2e5f-4851-8224-cb91f9501782-kube-api-access-rv5nz\") pod \"91a0099e-2e5f-4851-8224-cb91f9501782\" (UID: \"91a0099e-2e5f-4851-8224-cb91f9501782\") "
Apr 22 18:58:08.489062 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:08.489034 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91a0099e-2e5f-4851-8224-cb91f9501782-kube-api-access-rv5nz" (OuterVolumeSpecName: "kube-api-access-rv5nz") pod "91a0099e-2e5f-4851-8224-cb91f9501782" (UID: "91a0099e-2e5f-4851-8224-cb91f9501782"). InnerVolumeSpecName "kube-api-access-rv5nz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:58:08.588281 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:08.588196 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rv5nz\" (UniqueName: \"kubernetes.io/projected/91a0099e-2e5f-4851-8224-cb91f9501782-kube-api-access-rv5nz\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\""
Apr 22 18:58:09.288181 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:09.288145 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-cxc57" event={"ID":"91a0099e-2e5f-4851-8224-cb91f9501782","Type":"ContainerDied","Data":"47ae001d26530689ebc4a16180a8634001728f1beb17846c868de15912eb1ae7"}
Apr 22 18:58:09.288181 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:09.288180 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47ae001d26530689ebc4a16180a8634001728f1beb17846c868de15912eb1ae7"
Apr 22 18:58:09.288387 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:09.288197 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-cxc57"
Apr 22 18:58:19.922008 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:19.921969 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s"]
Apr 22 18:58:19.922521 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:19.922440 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="91a0099e-2e5f-4851-8224-cb91f9501782" containerName="s3-init"
Apr 22 18:58:19.922521 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:19.922455 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="91a0099e-2e5f-4851-8224-cb91f9501782" containerName="s3-init"
Apr 22 18:58:19.922651 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:19.922575 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="91a0099e-2e5f-4851-8224-cb91f9501782" containerName="s3-init"
Apr 22 18:58:19.924980 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:19.924954 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s"
Apr 22 18:58:19.929952 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:19.929925 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 22 18:58:19.930210 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:19.930192 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\""
Apr 22 18:58:19.930441 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:19.930424 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 22 18:58:19.930665 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:19.930647 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-1-openshift-default-dockercfg-ww4lg\""
Apr 22 18:58:19.939127 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:19.939099 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s"]
Apr 22 18:58:19.999828 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:19.999797 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/e2e10ab8-e572-41fb-a750-a15455e243af-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4rw4s\" (UID: \"e2e10ab8-e572-41fb-a750-a15455e243af\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s"
Apr 22 18:58:19.999994 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:19.999832 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/e2e10ab8-e572-41fb-a750-a15455e243af-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4rw4s\" (UID: \"e2e10ab8-e572-41fb-a750-a15455e243af\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s"
Apr 22 18:58:19.999994 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:19.999868 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e2e10ab8-e572-41fb-a750-a15455e243af-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4rw4s\" (UID: \"e2e10ab8-e572-41fb-a750-a15455e243af\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s"
Apr 22 18:58:19.999994 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:19.999933 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjfnb\" (UniqueName: \"kubernetes.io/projected/e2e10ab8-e572-41fb-a750-a15455e243af-kube-api-access-hjfnb\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4rw4s\" (UID: \"e2e10ab8-e572-41fb-a750-a15455e243af\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s"
Apr 22 18:58:19.999994 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:19.999980 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/e2e10ab8-e572-41fb-a750-a15455e243af-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4rw4s\" (UID: \"e2e10ab8-e572-41fb-a750-a15455e243af\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s"
Apr 22 18:58:20.000129 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:20.000010 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/e2e10ab8-e572-41fb-a750-a15455e243af-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4rw4s\" (UID: \"e2e10ab8-e572-41fb-a750-a15455e243af\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s"
Apr 22 18:58:20.000129 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:20.000045 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e2e10ab8-e572-41fb-a750-a15455e243af-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4rw4s\" (UID: \"e2e10ab8-e572-41fb-a750-a15455e243af\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s"
Apr 22 18:58:20.000129 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:20.000071 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/e2e10ab8-e572-41fb-a750-a15455e243af-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4rw4s\" (UID: \"e2e10ab8-e572-41fb-a750-a15455e243af\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s"
Apr 22 18:58:20.000129 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:20.000087 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/e2e10ab8-e572-41fb-a750-a15455e243af-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4rw4s\" (UID: \"e2e10ab8-e572-41fb-a750-a15455e243af\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s"
Apr 22 18:58:20.101075 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:20.101036 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/e2e10ab8-e572-41fb-a750-a15455e243af-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4rw4s\" (UID: \"e2e10ab8-e572-41fb-a750-a15455e243af\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s"
Apr 22 18:58:20.101256 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:20.101087 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/e2e10ab8-e572-41fb-a750-a15455e243af-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4rw4s\" (UID: \"e2e10ab8-e572-41fb-a750-a15455e243af\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s"
Apr 22 18:58:20.101256 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:20.101132 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e2e10ab8-e572-41fb-a750-a15455e243af-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4rw4s\" (UID: \"e2e10ab8-e572-41fb-a750-a15455e243af\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s"
Apr 22 18:58:20.101256 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:20.101166 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/e2e10ab8-e572-41fb-a750-a15455e243af-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4rw4s\" (UID: \"e2e10ab8-e572-41fb-a750-a15455e243af\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s"
Apr 22 18:58:20.101256 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:20.101191 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/e2e10ab8-e572-41fb-a750-a15455e243af-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4rw4s\" (UID: \"e2e10ab8-e572-41fb-a750-a15455e243af\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s"
Apr 22 18:58:20.101512 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:20.101260 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/e2e10ab8-e572-41fb-a750-a15455e243af-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4rw4s\" (UID: \"e2e10ab8-e572-41fb-a750-a15455e243af\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s"
Apr 22 18:58:20.101512 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:20.101289 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/e2e10ab8-e572-41fb-a750-a15455e243af-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4rw4s\" (UID: \"e2e10ab8-e572-41fb-a750-a15455e243af\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s"
Apr 22 18:58:20.101512 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:20.101326 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e2e10ab8-e572-41fb-a750-a15455e243af-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4rw4s\" (UID: \"e2e10ab8-e572-41fb-a750-a15455e243af\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s"
Apr 22 18:58:20.101512 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:20.101354 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hjfnb\" (UniqueName: \"kubernetes.io/projected/e2e10ab8-e572-41fb-a750-a15455e243af-kube-api-access-hjfnb\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4rw4s\" (UID: \"e2e10ab8-e572-41fb-a750-a15455e243af\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s"
Apr 22 18:58:20.101740 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:20.101526 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/e2e10ab8-e572-41fb-a750-a15455e243af-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4rw4s\" (UID: \"e2e10ab8-e572-41fb-a750-a15455e243af\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s"
Apr 22 18:58:20.101740 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:20.101543 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/e2e10ab8-e572-41fb-a750-a15455e243af-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4rw4s\" (UID: \"e2e10ab8-e572-41fb-a750-a15455e243af\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s"
Apr 22 18:58:20.101740 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:20.101576 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/e2e10ab8-e572-41fb-a750-a15455e243af-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4rw4s\" (UID: \"e2e10ab8-e572-41fb-a750-a15455e243af\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s"
Apr 22 18:58:20.101888 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:20.101868 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/e2e10ab8-e572-41fb-a750-a15455e243af-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4rw4s\" (UID: \"e2e10ab8-e572-41fb-a750-a15455e243af\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s"
Apr 22 18:58:20.102163 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:20.102138 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/e2e10ab8-e572-41fb-a750-a15455e243af-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4rw4s\" (UID: \"e2e10ab8-e572-41fb-a750-a15455e243af\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s"
Apr 22 18:58:20.103773 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:20.103748 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/e2e10ab8-e572-41fb-a750-a15455e243af-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4rw4s\" (UID: \"e2e10ab8-e572-41fb-a750-a15455e243af\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s"
Apr 22 18:58:20.103983 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:20.103965 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e2e10ab8-e572-41fb-a750-a15455e243af-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4rw4s\" (UID: \"e2e10ab8-e572-41fb-a750-a15455e243af\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s"
Apr 22 18:58:20.108237 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:20.108214 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e2e10ab8-e572-41fb-a750-a15455e243af-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4rw4s\" (UID: \"e2e10ab8-e572-41fb-a750-a15455e243af\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s"
Apr 22 18:58:20.108331 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:20.108310 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjfnb\" (UniqueName: \"kubernetes.io/projected/e2e10ab8-e572-41fb-a750-a15455e243af-kube-api-access-hjfnb\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4rw4s\" (UID: \"e2e10ab8-e572-41fb-a750-a15455e243af\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s"
Apr 22 18:58:20.245152 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:20.245115 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s"
Apr 22 18:58:20.381977 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:20.381952 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s"]
Apr 22 18:58:20.384398 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:58:20.384367 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2e10ab8_e572_41fb_a750_a15455e243af.slice/crio-6ed4afd92f07c0ed6bd9134ba30266cc9fab64cef99c92a72bfd2fa42b808460 WatchSource:0}: Error finding container 6ed4afd92f07c0ed6bd9134ba30266cc9fab64cef99c92a72bfd2fa42b808460: Status 404 returned error can't find the container with id 6ed4afd92f07c0ed6bd9134ba30266cc9fab64cef99c92a72bfd2fa42b808460
Apr 22 18:58:20.386673 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:20.386632 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 22 18:58:20.386753 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:20.386725 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 22 18:58:20.386820 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:20.386768 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 22 18:58:21.328022 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:21.327977 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s" event={"ID":"e2e10ab8-e572-41fb-a750-a15455e243af","Type":"ContainerStarted","Data":"5703f7c775e795ab372d3af2c80803638941765d155042b2383ea28d7df6e400"}
Apr 22 18:58:21.328022 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:21.328026 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s" event={"ID":"e2e10ab8-e572-41fb-a750-a15455e243af","Type":"ContainerStarted","Data":"6ed4afd92f07c0ed6bd9134ba30266cc9fab64cef99c92a72bfd2fa42b808460"}
Apr 22 18:58:21.346810 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:21.346745 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s" podStartSLOduration=2.346725322 podStartE2EDuration="2.346725322s" podCreationTimestamp="2026-04-22 18:58:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:58:21.343789091 +0000 UTC m=+687.792571422" watchObservedRunningTime="2026-04-22 18:58:21.346725322 +0000 UTC m=+687.795507653"
Apr 22 18:58:22.246045 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:22.246007 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s"
Apr 22 18:58:22.251114 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:22.251088 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s"
Apr 22 18:58:22.331175 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:22.331142 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s"
Apr 22 18:58:22.332123 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:22.332104 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4rw4s"
Apr 22 18:58:31.813111 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:31.813072 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj"]
Apr 22 18:58:31.816608 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:31.816586 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj"
Apr 22 18:58:31.820234 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:31.820206 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\""
Apr 22 18:58:31.820375 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:31.820290 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4mhhb\""
Apr 22 18:58:31.829663 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:31.829635 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj"]
Apr 22 18:58:32.008343 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:32.008294 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9aeb0920-0d2a-445c-a167-798726d06c3d-home\") pod \"scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj\" (UID: \"9aeb0920-0d2a-445c-a167-798726d06c3d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj"
Apr 22 18:58:32.008557 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:32.008407 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9aeb0920-0d2a-445c-a167-798726d06c3d-dshm\") pod \"scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj\" (UID: \"9aeb0920-0d2a-445c-a167-798726d06c3d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj"
Apr 22 18:58:32.008557 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:32.008433 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9aeb0920-0d2a-445c-a167-798726d06c3d-model-cache\") pod \"scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj\" (UID: \"9aeb0920-0d2a-445c-a167-798726d06c3d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj"
Apr 22 18:58:32.008557 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:32.008506 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9aeb0920-0d2a-445c-a167-798726d06c3d-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj\" (UID: \"9aeb0920-0d2a-445c-a167-798726d06c3d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj"
Apr 22 18:58:32.008557 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:32.008539 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnxw9\" (UniqueName: \"kubernetes.io/projected/9aeb0920-0d2a-445c-a167-798726d06c3d-kube-api-access-xnxw9\") pod \"scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj\" (UID: \"9aeb0920-0d2a-445c-a167-798726d06c3d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj"
Apr 22 18:58:32.008760 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:32.008639 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9aeb0920-0d2a-445c-a167-798726d06c3d-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj\" (UID: \"9aeb0920-0d2a-445c-a167-798726d06c3d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj"
Apr 22 18:58:32.109535 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:32.109424 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9aeb0920-0d2a-445c-a167-798726d06c3d-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj\" (UID: \"9aeb0920-0d2a-445c-a167-798726d06c3d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj"
Apr 22 18:58:32.109535 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:32.109478 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9aeb0920-0d2a-445c-a167-798726d06c3d-home\") pod \"scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj\" (UID: \"9aeb0920-0d2a-445c-a167-798726d06c3d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj"
Apr 22 18:58:32.109743 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:32.109573 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9aeb0920-0d2a-445c-a167-798726d06c3d-dshm\") pod \"scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj\" (UID: \"9aeb0920-0d2a-445c-a167-798726d06c3d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj"
Apr 22 18:58:32.109743 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:32.109605 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9aeb0920-0d2a-445c-a167-798726d06c3d-model-cache\") pod \"scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj\" (UID: \"9aeb0920-0d2a-445c-a167-798726d06c3d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj"
Apr 22 18:58:32.109743 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:32.109629 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9aeb0920-0d2a-445c-a167-798726d06c3d-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj\" (UID: \"9aeb0920-0d2a-445c-a167-798726d06c3d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj"
Apr 22 18:58:32.109743 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:32.109655 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xnxw9\" (UniqueName: \"kubernetes.io/projected/9aeb0920-0d2a-445c-a167-798726d06c3d-kube-api-access-xnxw9\") pod \"scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj\" (UID: \"9aeb0920-0d2a-445c-a167-798726d06c3d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj"
Apr 22 18:58:32.109947 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:32.109885 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9aeb0920-0d2a-445c-a167-798726d06c3d-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj\" (UID: \"9aeb0920-0d2a-445c-a167-798726d06c3d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj"
Apr 22 18:58:32.109947 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:32.109915 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9aeb0920-0d2a-445c-a167-798726d06c3d-home\") pod \"scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj\" (UID: \"9aeb0920-0d2a-445c-a167-798726d06c3d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj"
Apr 22 18:58:32.110051 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:32.109961 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9aeb0920-0d2a-445c-a167-798726d06c3d-model-cache\") pod \"scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj\" (UID: \"9aeb0920-0d2a-445c-a167-798726d06c3d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj"
Apr 22 18:58:32.111854 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:32.111831 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9aeb0920-0d2a-445c-a167-798726d06c3d-dshm\") pod \"scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj\" (UID: \"9aeb0920-0d2a-445c-a167-798726d06c3d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj"
Apr 22 18:58:32.112103 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:32.112083 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9aeb0920-0d2a-445c-a167-798726d06c3d-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj\" (UID: \"9aeb0920-0d2a-445c-a167-798726d06c3d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj"
Apr 22 18:58:32.117174 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:32.117148 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnxw9\" (UniqueName: \"kubernetes.io/projected/9aeb0920-0d2a-445c-a167-798726d06c3d-kube-api-access-xnxw9\") pod \"scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj\" (UID: \"9aeb0920-0d2a-445c-a167-798726d06c3d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj"
Apr 22 18:58:32.130036 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:32.130006 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj"
Apr 22 18:58:32.261012 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:32.260983 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj"]
Apr 22 18:58:32.264656 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:58:32.264610 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9aeb0920_0d2a_445c_a167_798726d06c3d.slice/crio-2bbf23024c7a32ddd8e314a3ced1a7b1a7a8abc58ce55a47cd80b19bbd318941 WatchSource:0}: Error finding container 2bbf23024c7a32ddd8e314a3ced1a7b1a7a8abc58ce55a47cd80b19bbd318941: Status 404 returned error can't find the container with id 2bbf23024c7a32ddd8e314a3ced1a7b1a7a8abc58ce55a47cd80b19bbd318941
Apr 22 18:58:32.366899 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:32.366812 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj" event={"ID":"9aeb0920-0d2a-445c-a167-798726d06c3d","Type":"ContainerStarted","Data":"2bbf23024c7a32ddd8e314a3ced1a7b1a7a8abc58ce55a47cd80b19bbd318941"}
Apr 22 18:58:36.383754 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:36.383711 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj" event={"ID":"9aeb0920-0d2a-445c-a167-798726d06c3d","Type":"ContainerStarted","Data":"5eb3f79f8701fd69adcd299563d70b857e6300d427fa43137218083cdf84712a"}
Apr 22 18:58:41.403807 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:41.403769 2579 generic.go:358] "Generic (PLEG): container finished" podID="9aeb0920-0d2a-445c-a167-798726d06c3d" containerID="5eb3f79f8701fd69adcd299563d70b857e6300d427fa43137218083cdf84712a" exitCode=0
Apr 22 18:58:41.404274 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:41.403837 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj" event={"ID":"9aeb0920-0d2a-445c-a167-798726d06c3d","Type":"ContainerDied","Data":"5eb3f79f8701fd69adcd299563d70b857e6300d427fa43137218083cdf84712a"}
Apr 22 18:58:43.412783 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:43.412745 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj" event={"ID":"9aeb0920-0d2a-445c-a167-798726d06c3d","Type":"ContainerStarted","Data":"7d226032c5bbb16c06b952bb65a51ed0778c9aec4cceae530bd624a39348bbf3"}
Apr 22 18:58:43.430470 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:43.430399 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj" podStartSLOduration=2.097343885 podStartE2EDuration="12.430376869s" podCreationTimestamp="2026-04-22 18:58:31 +0000 UTC" firstStartedPulling="2026-04-22 18:58:32.267164574 +0000 UTC m=+698.715946881" lastFinishedPulling="2026-04-22 18:58:42.600197553 +0000 UTC m=+709.048979865" observedRunningTime="2026-04-22 18:58:43.427786839 +0000 UTC m=+709.876569188" watchObservedRunningTime="2026-04-22 18:58:43.430376869 +0000 UTC m=+709.879159200"
Apr 22 18:58:52.131159 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:52.131116 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj"
Apr 22 18:58:52.131159 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:52.131166 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj"
Apr 22 18:58:52.143925 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:52.143897 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj"
Apr 22 18:58:52.456759 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:58:52.456683 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj"
Apr 22 18:59:23.577365 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:23.577328 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj"]
Apr 22 18:59:23.577900 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:23.577661 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj" podUID="9aeb0920-0d2a-445c-a167-798726d06c3d" containerName="main" containerID="cri-o://7d226032c5bbb16c06b952bb65a51ed0778c9aec4cceae530bd624a39348bbf3" gracePeriod=30
Apr 22 18:59:23.822667 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:23.822644 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj"
Apr 22 18:59:23.914272 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:23.914186 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9aeb0920-0d2a-445c-a167-798726d06c3d-dshm\") pod \"9aeb0920-0d2a-445c-a167-798726d06c3d\" (UID: \"9aeb0920-0d2a-445c-a167-798726d06c3d\") "
Apr 22 18:59:23.914272 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:23.914220 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9aeb0920-0d2a-445c-a167-798726d06c3d-tls-certs\") pod \"9aeb0920-0d2a-445c-a167-798726d06c3d\" (UID: \"9aeb0920-0d2a-445c-a167-798726d06c3d\") "
Apr 22 18:59:23.914272 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:23.914269 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnxw9\" (UniqueName: \"kubernetes.io/projected/9aeb0920-0d2a-445c-a167-798726d06c3d-kube-api-access-xnxw9\") pod \"9aeb0920-0d2a-445c-a167-798726d06c3d\" (UID: \"9aeb0920-0d2a-445c-a167-798726d06c3d\") "
Apr 22 18:59:23.914579 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:23.914286 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9aeb0920-0d2a-445c-a167-798726d06c3d-home\") pod \"9aeb0920-0d2a-445c-a167-798726d06c3d\" (UID: \"9aeb0920-0d2a-445c-a167-798726d06c3d\") "
Apr 22 18:59:23.914579 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:23.914310 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9aeb0920-0d2a-445c-a167-798726d06c3d-kserve-provision-location\") pod \"9aeb0920-0d2a-445c-a167-798726d06c3d\" (UID: \"9aeb0920-0d2a-445c-a167-798726d06c3d\") "
Apr 22 18:59:23.914579 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:23.914331 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9aeb0920-0d2a-445c-a167-798726d06c3d-model-cache\") pod \"9aeb0920-0d2a-445c-a167-798726d06c3d\" (UID: \"9aeb0920-0d2a-445c-a167-798726d06c3d\") "
Apr 22 18:59:23.914723 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:23.914615 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9aeb0920-0d2a-445c-a167-798726d06c3d-home" (OuterVolumeSpecName: "home") pod "9aeb0920-0d2a-445c-a167-798726d06c3d" (UID: "9aeb0920-0d2a-445c-a167-798726d06c3d"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:59:23.914763 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:23.914708 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9aeb0920-0d2a-445c-a167-798726d06c3d-model-cache" (OuterVolumeSpecName: "model-cache") pod "9aeb0920-0d2a-445c-a167-798726d06c3d" (UID: "9aeb0920-0d2a-445c-a167-798726d06c3d"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:59:23.916605 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:23.916575 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aeb0920-0d2a-445c-a167-798726d06c3d-kube-api-access-xnxw9" (OuterVolumeSpecName: "kube-api-access-xnxw9") pod "9aeb0920-0d2a-445c-a167-798726d06c3d" (UID: "9aeb0920-0d2a-445c-a167-798726d06c3d"). InnerVolumeSpecName "kube-api-access-xnxw9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:59:23.916710 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:23.916601 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9aeb0920-0d2a-445c-a167-798726d06c3d-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "9aeb0920-0d2a-445c-a167-798726d06c3d" (UID: "9aeb0920-0d2a-445c-a167-798726d06c3d"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:59:23.916710 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:23.916674 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9aeb0920-0d2a-445c-a167-798726d06c3d-dshm" (OuterVolumeSpecName: "dshm") pod "9aeb0920-0d2a-445c-a167-798726d06c3d" (UID: "9aeb0920-0d2a-445c-a167-798726d06c3d"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:59:23.970151 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:23.970107 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9aeb0920-0d2a-445c-a167-798726d06c3d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9aeb0920-0d2a-445c-a167-798726d06c3d" (UID: "9aeb0920-0d2a-445c-a167-798726d06c3d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:59:24.015581 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:24.015530 2579 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9aeb0920-0d2a-445c-a167-798726d06c3d-dshm\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\""
Apr 22 18:59:24.015581 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:24.015575 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9aeb0920-0d2a-445c-a167-798726d06c3d-tls-certs\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\""
Apr 22 18:59:24.015581 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:24.015586 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xnxw9\" (UniqueName: \"kubernetes.io/projected/9aeb0920-0d2a-445c-a167-798726d06c3d-kube-api-access-xnxw9\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\""
Apr 22 18:59:24.015581 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:24.015594 2579 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9aeb0920-0d2a-445c-a167-798726d06c3d-home\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\""
Apr 22 18:59:24.015581 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:24.015604 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9aeb0920-0d2a-445c-a167-798726d06c3d-kserve-provision-location\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\""
Apr 22 18:59:24.015878 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:24.015614 2579 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9aeb0920-0d2a-445c-a167-798726d06c3d-model-cache\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\""
Apr 22 18:59:24.559277 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:24.559242 2579 generic.go:358] "Generic (PLEG): container finished" podID="9aeb0920-0d2a-445c-a167-798726d06c3d" containerID="7d226032c5bbb16c06b952bb65a51ed0778c9aec4cceae530bd624a39348bbf3" exitCode=0
Apr 22 18:59:24.559442 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:24.559299 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj" event={"ID":"9aeb0920-0d2a-445c-a167-798726d06c3d","Type":"ContainerDied","Data":"7d226032c5bbb16c06b952bb65a51ed0778c9aec4cceae530bd624a39348bbf3"}
Apr 22 18:59:24.559442 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:24.559318 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj"
Apr 22 18:59:24.559442 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:24.559333 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj" event={"ID":"9aeb0920-0d2a-445c-a167-798726d06c3d","Type":"ContainerDied","Data":"2bbf23024c7a32ddd8e314a3ced1a7b1a7a8abc58ce55a47cd80b19bbd318941"}
Apr 22 18:59:24.559442 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:24.559352 2579 scope.go:117] "RemoveContainer" containerID="7d226032c5bbb16c06b952bb65a51ed0778c9aec4cceae530bd624a39348bbf3"
Apr 22 18:59:24.567211 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:24.567191 2579 scope.go:117] "RemoveContainer" containerID="5eb3f79f8701fd69adcd299563d70b857e6300d427fa43137218083cdf84712a"
Apr 22 18:59:24.576408 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:24.576369 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj"]
Apr 22 18:59:24.579424 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:24.579403 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-666878ff4d-bgxmj"]
Apr 22 18:59:24.627172 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:24.626996 2579 scope.go:117] "RemoveContainer" containerID="7d226032c5bbb16c06b952bb65a51ed0778c9aec4cceae530bd624a39348bbf3"
Apr 22 18:59:24.627400 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:59:24.627378 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d226032c5bbb16c06b952bb65a51ed0778c9aec4cceae530bd624a39348bbf3\": container with ID starting with 7d226032c5bbb16c06b952bb65a51ed0778c9aec4cceae530bd624a39348bbf3 not found: ID does not exist" containerID="7d226032c5bbb16c06b952bb65a51ed0778c9aec4cceae530bd624a39348bbf3"
Apr 22 18:59:24.627459 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:24.627410 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d226032c5bbb16c06b952bb65a51ed0778c9aec4cceae530bd624a39348bbf3"} err="failed to get container status \"7d226032c5bbb16c06b952bb65a51ed0778c9aec4cceae530bd624a39348bbf3\": rpc error: code = NotFound desc = could not find container \"7d226032c5bbb16c06b952bb65a51ed0778c9aec4cceae530bd624a39348bbf3\": container with ID starting with 7d226032c5bbb16c06b952bb65a51ed0778c9aec4cceae530bd624a39348bbf3 not found: ID does not exist"
Apr 22 18:59:24.627459 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:24.627431 2579 scope.go:117] "RemoveContainer" containerID="5eb3f79f8701fd69adcd299563d70b857e6300d427fa43137218083cdf84712a"
Apr 22 18:59:24.627838 ip-10-0-136-53 kubenswrapper[2579]: E0422 18:59:24.627806 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5eb3f79f8701fd69adcd299563d70b857e6300d427fa43137218083cdf84712a\": container with ID starting with 5eb3f79f8701fd69adcd299563d70b857e6300d427fa43137218083cdf84712a not found: ID does not exist" containerID="5eb3f79f8701fd69adcd299563d70b857e6300d427fa43137218083cdf84712a"
Apr 22 18:59:24.627905 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:24.627844 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eb3f79f8701fd69adcd299563d70b857e6300d427fa43137218083cdf84712a"} err="failed to get container status \"5eb3f79f8701fd69adcd299563d70b857e6300d427fa43137218083cdf84712a\": rpc error: code = NotFound desc = could not find container \"5eb3f79f8701fd69adcd299563d70b857e6300d427fa43137218083cdf84712a\": container with ID starting with 5eb3f79f8701fd69adcd299563d70b857e6300d427fa43137218083cdf84712a not found: ID does not exist"
Apr 22 18:59:26.164667 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:26.164632 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aeb0920-0d2a-445c-a167-798726d06c3d" path="/var/lib/kubelet/pods/9aeb0920-0d2a-445c-a167-798726d06c3d/volumes"
Apr 22 18:59:30.417023 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.416986 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl"]
Apr 22 18:59:30.417529 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.417510 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9aeb0920-0d2a-445c-a167-798726d06c3d" containerName="storage-initializer"
Apr 22 18:59:30.417580 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.417533 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aeb0920-0d2a-445c-a167-798726d06c3d" containerName="storage-initializer"
Apr 22 18:59:30.417580 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.417560 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9aeb0920-0d2a-445c-a167-798726d06c3d" containerName="main"
Apr 22 18:59:30.417580 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.417569 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aeb0920-0d2a-445c-a167-798726d06c3d" containerName="main"
Apr 22 18:59:30.417692 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.417681 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="9aeb0920-0d2a-445c-a167-798726d06c3d" containerName="main"
Apr 22 18:59:30.422881 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.422857 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl"
Apr 22 18:59:30.426542 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.426479 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\""
Apr 22 18:59:30.426916 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.426891 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4mhhb\""
Apr 22 18:59:30.429502 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.429458 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl"]
Apr 22 18:59:30.470441 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.470350 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/419f4911-2179-400e-a3b6-49bdc0002542-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-79c76456d4-t6knl\" (UID: \"419f4911-2179-400e-a3b6-49bdc0002542\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl"
Apr 22 18:59:30.470441 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.470407 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/419f4911-2179-400e-a3b6-49bdc0002542-model-cache\") pod \"scheduler-ha-replicas-test-kserve-79c76456d4-t6knl\" (UID: \"419f4911-2179-400e-a3b6-49bdc0002542\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl"
Apr 22 18:59:30.470441 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.470435 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/419f4911-2179-400e-a3b6-49bdc0002542-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-79c76456d4-t6knl\" (UID: \"419f4911-2179-400e-a3b6-49bdc0002542\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl"
Apr 22 18:59:30.470707 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.470460 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/419f4911-2179-400e-a3b6-49bdc0002542-home\") pod \"scheduler-ha-replicas-test-kserve-79c76456d4-t6knl\" (UID: \"419f4911-2179-400e-a3b6-49bdc0002542\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl"
Apr 22 18:59:30.470707 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.470476 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/419f4911-2179-400e-a3b6-49bdc0002542-dshm\") pod \"scheduler-ha-replicas-test-kserve-79c76456d4-t6knl\" (UID: \"419f4911-2179-400e-a3b6-49bdc0002542\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl"
Apr 22 18:59:30.470707 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.470526 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm449\" (UniqueName: \"kubernetes.io/projected/419f4911-2179-400e-a3b6-49bdc0002542-kube-api-access-rm449\") pod \"scheduler-ha-replicas-test-kserve-79c76456d4-t6knl\" (UID: \"419f4911-2179-400e-a3b6-49bdc0002542\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl"
Apr 22 18:59:30.571937 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.571902 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/419f4911-2179-400e-a3b6-49bdc0002542-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-79c76456d4-t6knl\" (UID: \"419f4911-2179-400e-a3b6-49bdc0002542\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl"
Apr 22 18:59:30.572130 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.571954 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/419f4911-2179-400e-a3b6-49bdc0002542-model-cache\") pod \"scheduler-ha-replicas-test-kserve-79c76456d4-t6knl\" (UID: \"419f4911-2179-400e-a3b6-49bdc0002542\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl"
Apr 22 18:59:30.572130 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.571982 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/419f4911-2179-400e-a3b6-49bdc0002542-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-79c76456d4-t6knl\" (UID: \"419f4911-2179-400e-a3b6-49bdc0002542\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl"
Apr 22 18:59:30.572130 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.572010 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/419f4911-2179-400e-a3b6-49bdc0002542-home\") pod \"scheduler-ha-replicas-test-kserve-79c76456d4-t6knl\" (UID: \"419f4911-2179-400e-a3b6-49bdc0002542\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl"
Apr 22 18:59:30.572130 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.572028 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/419f4911-2179-400e-a3b6-49bdc0002542-dshm\") pod \"scheduler-ha-replicas-test-kserve-79c76456d4-t6knl\" (UID: \"419f4911-2179-400e-a3b6-49bdc0002542\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl"
Apr 22 18:59:30.572130 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.572055 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rm449\" (UniqueName: \"kubernetes.io/projected/419f4911-2179-400e-a3b6-49bdc0002542-kube-api-access-rm449\") pod \"scheduler-ha-replicas-test-kserve-79c76456d4-t6knl\" (UID: \"419f4911-2179-400e-a3b6-49bdc0002542\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl"
Apr 22 18:59:30.572373 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.572323 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/419f4911-2179-400e-a3b6-49bdc0002542-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-79c76456d4-t6knl\" (UID: \"419f4911-2179-400e-a3b6-49bdc0002542\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl"
Apr 22 18:59:30.572436 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.572371 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/419f4911-2179-400e-a3b6-49bdc0002542-home\") pod \"scheduler-ha-replicas-test-kserve-79c76456d4-t6knl\" (UID: \"419f4911-2179-400e-a3b6-49bdc0002542\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl"
Apr 22 18:59:30.572514 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.572460 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/419f4911-2179-400e-a3b6-49bdc0002542-model-cache\") pod \"scheduler-ha-replicas-test-kserve-79c76456d4-t6knl\" (UID: \"419f4911-2179-400e-a3b6-49bdc0002542\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl"
Apr 22 18:59:30.574464 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.574437 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/419f4911-2179-400e-a3b6-49bdc0002542-dshm\") pod \"scheduler-ha-replicas-test-kserve-79c76456d4-t6knl\" (UID: \"419f4911-2179-400e-a3b6-49bdc0002542\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl"
Apr 22 18:59:30.574601 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.574566 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/419f4911-2179-400e-a3b6-49bdc0002542-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-79c76456d4-t6knl\" (UID: \"419f4911-2179-400e-a3b6-49bdc0002542\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl"
Apr 22 18:59:30.579578 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.579553 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm449\" (UniqueName: \"kubernetes.io/projected/419f4911-2179-400e-a3b6-49bdc0002542-kube-api-access-rm449\") pod \"scheduler-ha-replicas-test-kserve-79c76456d4-t6knl\" (UID: \"419f4911-2179-400e-a3b6-49bdc0002542\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl"
Apr 22 18:59:30.632463 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.632423 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd"]
Apr 22 18:59:30.636922 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.636468 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd"
Apr 22 18:59:30.645962 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.639297 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-xjm7q\""
Apr 22 18:59:30.651737 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.651675 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd"]
Apr 22 18:59:30.735537 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.735470 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl"
Apr 22 18:59:30.773760 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.773717 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd\" (UID: \"4d5b86b1-22e8-41d2-a31f-418fbc311fe2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd"
Apr 22 18:59:30.773923 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.773879 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd\" (UID: \"4d5b86b1-22e8-41d2-a31f-418fbc311fe2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd"
Apr 22 18:59:30.773968 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.773925 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd\" (UID: \"4d5b86b1-22e8-41d2-a31f-418fbc311fe2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd"
Apr 22 18:59:30.774014 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.773964 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-282gc\" (UniqueName: \"kubernetes.io/projected/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-kube-api-access-282gc\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd\" (UID: \"4d5b86b1-22e8-41d2-a31f-418fbc311fe2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd"
Apr 22 18:59:30.774014 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.774000 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd\" (UID: \"4d5b86b1-22e8-41d2-a31f-418fbc311fe2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd"
Apr 22 18:59:30.774097 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.774063 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd\" (UID: \"4d5b86b1-22e8-41d2-a31f-418fbc311fe2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd"
Apr 22 18:59:30.859805 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.859780 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl"]
Apr 22 18:59:30.861608 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:59:30.861567 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod419f4911_2179_400e_a3b6_49bdc0002542.slice/crio-5a420debf40a1bfe6e94a15e8086474f535e1fd0bacce76b1d2c8c0a953e275f WatchSource:0}: Error finding container 5a420debf40a1bfe6e94a15e8086474f535e1fd0bacce76b1d2c8c0a953e275f: Status 404 returned error can't find the container with id 5a420debf40a1bfe6e94a15e8086474f535e1fd0bacce76b1d2c8c0a953e275f
Apr 22 18:59:30.863325 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.863308 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 18:59:30.875038 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.875011 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd\" (UID: \"4d5b86b1-22e8-41d2-a31f-418fbc311fe2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd"
Apr 22 18:59:30.875141 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.875062 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-282gc\" (UniqueName: \"kubernetes.io/projected/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-kube-api-access-282gc\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd\" (UID: \"4d5b86b1-22e8-41d2-a31f-418fbc311fe2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd"
Apr 22 18:59:30.875141 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.875102 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd\" (UID: \"4d5b86b1-22e8-41d2-a31f-418fbc311fe2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd"
Apr 22 18:59:30.875141 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.875134 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd\" (UID: \"4d5b86b1-22e8-41d2-a31f-418fbc311fe2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd"
Apr 22 18:59:30.875270 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.875204 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd\" (UID: \"4d5b86b1-22e8-41d2-a31f-418fbc311fe2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd"
Apr 22 18:59:30.875347 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.875328 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd\" (UID: \"4d5b86b1-22e8-41d2-a31f-418fbc311fe2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd"
Apr 22 18:59:30.875404 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.875340 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd\" (UID: \"4d5b86b1-22e8-41d2-a31f-418fbc311fe2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd"
Apr 22 18:59:30.875453 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.875405 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd\" (UID: \"4d5b86b1-22e8-41d2-a31f-418fbc311fe2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd"
Apr 22 18:59:30.875565 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.875546 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd\" (UID: \"4d5b86b1-22e8-41d2-a31f-418fbc311fe2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd"
Apr 22 18:59:30.875694 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.875670 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd\" (UID: \"4d5b86b1-22e8-41d2-a31f-418fbc311fe2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd"
Apr 22 18:59:30.877883 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.877861 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd\" (UID: \"4d5b86b1-22e8-41d2-a31f-418fbc311fe2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd"
Apr 22 18:59:30.882554 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.882525 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-282gc\" (UniqueName: \"kubernetes.io/projected/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-kube-api-access-282gc\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd\" (UID: \"4d5b86b1-22e8-41d2-a31f-418fbc311fe2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd"
Apr 22 18:59:30.954524 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:30.954465 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd"
Apr 22 18:59:31.094703 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:31.094663 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd"]
Apr 22 18:59:31.097896 ip-10-0-136-53 kubenswrapper[2579]: W0422 18:59:31.097861 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d5b86b1_22e8_41d2_a31f_418fbc311fe2.slice/crio-7ac3d04fe4940cfabed24ede7fc74a790499f2a3f204e4526a0a19a3c887281d WatchSource:0}: Error finding container 7ac3d04fe4940cfabed24ede7fc74a790499f2a3f204e4526a0a19a3c887281d: Status 404 returned error can't find the container with id 7ac3d04fe4940cfabed24ede7fc74a790499f2a3f204e4526a0a19a3c887281d
Apr 22 18:59:31.586922 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:31.586875 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl" event={"ID":"419f4911-2179-400e-a3b6-49bdc0002542","Type":"ContainerStarted","Data":"42d3a9158e3c3384aaf3f96023fadee4672b9945f9b2349e75d3a5464a877ae5"}
Apr 22 18:59:31.587336 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:31.586927 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl" event={"ID":"419f4911-2179-400e-a3b6-49bdc0002542","Type":"ContainerStarted","Data":"5a420debf40a1bfe6e94a15e8086474f535e1fd0bacce76b1d2c8c0a953e275f"}
Apr 22 18:59:31.588611 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:31.588583 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd" event={"ID":"4d5b86b1-22e8-41d2-a31f-418fbc311fe2","Type":"ContainerStarted","Data":"1145853eb4ad50a0bd31706dffd408482d1c1f428f30a9893b89070011a33b05"}
Apr 22 18:59:31.588724 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:31.588619 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd" event={"ID":"4d5b86b1-22e8-41d2-a31f-418fbc311fe2","Type":"ContainerStarted","Data":"7ac3d04fe4940cfabed24ede7fc74a790499f2a3f204e4526a0a19a3c887281d"}
Apr 22 18:59:32.594781 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:32.594742 2579 generic.go:358] "Generic (PLEG): container finished" podID="4d5b86b1-22e8-41d2-a31f-418fbc311fe2" containerID="1145853eb4ad50a0bd31706dffd408482d1c1f428f30a9893b89070011a33b05" exitCode=0
Apr 22 18:59:32.595162 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:32.594813 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd" event={"ID":"4d5b86b1-22e8-41d2-a31f-418fbc311fe2","Type":"ContainerDied","Data":"1145853eb4ad50a0bd31706dffd408482d1c1f428f30a9893b89070011a33b05"}
Apr 22 18:59:34.608301 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:34.608251 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd" event={"ID":"4d5b86b1-22e8-41d2-a31f-418fbc311fe2","Type":"ContainerStarted","Data":"a86d9bb4c892d5b3d6f3c90d383b720f49c4dac33473623e1a38dacf08fd2531"}
Apr 22 18:59:36.618843 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:36.618802 2579 generic.go:358] "Generic (PLEG): container finished"
podID="419f4911-2179-400e-a3b6-49bdc0002542" containerID="42d3a9158e3c3384aaf3f96023fadee4672b9945f9b2349e75d3a5464a877ae5" exitCode=0 Apr 22 18:59:36.619389 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:36.618855 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl" event={"ID":"419f4911-2179-400e-a3b6-49bdc0002542","Type":"ContainerDied","Data":"42d3a9158e3c3384aaf3f96023fadee4672b9945f9b2349e75d3a5464a877ae5"} Apr 22 18:59:37.624995 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:37.624949 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl" event={"ID":"419f4911-2179-400e-a3b6-49bdc0002542","Type":"ContainerStarted","Data":"5fe393b7ae17e858868502bca8e074f68d7f9480e3c1abd4e561b98bdbe8f5e2"} Apr 22 18:59:37.641458 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:37.641396 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl" podStartSLOduration=7.641377406 podStartE2EDuration="7.641377406s" podCreationTimestamp="2026-04-22 18:59:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:59:37.639756331 +0000 UTC m=+764.088538661" watchObservedRunningTime="2026-04-22 18:59:37.641377406 +0000 UTC m=+764.090159737" Apr 22 18:59:40.736177 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:40.736135 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl" Apr 22 18:59:40.736177 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:40.736187 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl" Apr 22 18:59:40.751940 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:40.751908 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl" Apr 22 18:59:41.659251 ip-10-0-136-53 kubenswrapper[2579]: I0422 18:59:41.659215 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl" Apr 22 19:00:03.731979 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:03.731940 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd" event={"ID":"4d5b86b1-22e8-41d2-a31f-418fbc311fe2","Type":"ContainerStarted","Data":"95e15a3ffde37155f9d8eba5031d1cfaf0c3d97602692c075068c309d8c2d49b"} Apr 22 19:00:03.732513 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:03.732171 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd" Apr 22 19:00:03.734819 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:03.734781 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd" podUID="4d5b86b1-22e8-41d2-a31f-418fbc311fe2" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 22 19:00:03.757147 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:03.757078 2579 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd" podStartSLOduration=3.083063518 podStartE2EDuration="33.757056513s" podCreationTimestamp="2026-04-22 18:59:30 +0000 UTC" firstStartedPulling="2026-04-22 18:59:32.596242043 +0000 UTC m=+759.045024357" lastFinishedPulling="2026-04-22 19:00:03.27023503 +0000 UTC m=+789.719017352" observedRunningTime="2026-04-22 19:00:03.755319212 +0000 UTC m=+790.204101543" watchObservedRunningTime="2026-04-22 19:00:03.757056513 +0000 UTC m=+790.205838844" Apr 22 19:00:03.792701 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:03.792666 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl"] Apr 22 19:00:03.793081 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:03.792949 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl" podUID="419f4911-2179-400e-a3b6-49bdc0002542" containerName="main" containerID="cri-o://5fe393b7ae17e858868502bca8e074f68d7f9480e3c1abd4e561b98bdbe8f5e2" gracePeriod=30 Apr 22 19:00:03.802148 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:03.802118 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd"] Apr 22 19:00:03.816519 ip-10-0-136-53 kubenswrapper[2579]: E0422 19:00:03.816450 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-self-signed-certs: secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 22 19:00:03.816695 ip-10-0-136-53 kubenswrapper[2579]: E0422 19:00:03.816546 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tls-certs podName:4d5b86b1-22e8-41d2-a31f-418fbc311fe2 nodeName:}" failed. No retries permitted until 2026-04-22 19:00:04.316526664 +0000 UTC m=+790.765308979 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tls-certs") pod "scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd" (UID: "4d5b86b1-22e8-41d2-a31f-418fbc311fe2") : secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 22 19:00:04.045991 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:04.045958 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl" Apr 22 19:00:04.119253 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:04.119212 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/419f4911-2179-400e-a3b6-49bdc0002542-tls-certs\") pod \"419f4911-2179-400e-a3b6-49bdc0002542\" (UID: \"419f4911-2179-400e-a3b6-49bdc0002542\") " Apr 22 19:00:04.119418 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:04.119264 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/419f4911-2179-400e-a3b6-49bdc0002542-model-cache\") pod \"419f4911-2179-400e-a3b6-49bdc0002542\" (UID: \"419f4911-2179-400e-a3b6-49bdc0002542\") " Apr 22 19:00:04.119418 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:04.119383 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rm449\" (UniqueName: \"kubernetes.io/projected/419f4911-2179-400e-a3b6-49bdc0002542-kube-api-access-rm449\") pod \"419f4911-2179-400e-a3b6-49bdc0002542\" (UID: \"419f4911-2179-400e-a3b6-49bdc0002542\") " Apr 22 19:00:04.119560 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:04.119442 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/419f4911-2179-400e-a3b6-49bdc0002542-home\") pod \"419f4911-2179-400e-a3b6-49bdc0002542\" (UID: \"419f4911-2179-400e-a3b6-49bdc0002542\") " Apr 22 19:00:04.119560 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:04.119547 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/419f4911-2179-400e-a3b6-49bdc0002542-kserve-provision-location\") pod \"419f4911-2179-400e-a3b6-49bdc0002542\" (UID: \"419f4911-2179-400e-a3b6-49bdc0002542\") " Apr 22 19:00:04.119668 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:04.119579 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/419f4911-2179-400e-a3b6-49bdc0002542-dshm\") pod \"419f4911-2179-400e-a3b6-49bdc0002542\" (UID: \"419f4911-2179-400e-a3b6-49bdc0002542\") " Apr 22 19:00:04.119668 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:04.119619 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/419f4911-2179-400e-a3b6-49bdc0002542-model-cache" (OuterVolumeSpecName: "model-cache") pod "419f4911-2179-400e-a3b6-49bdc0002542" (UID: "419f4911-2179-400e-a3b6-49bdc0002542"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:00:04.119771 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:04.119728 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/419f4911-2179-400e-a3b6-49bdc0002542-home" (OuterVolumeSpecName: "home") pod "419f4911-2179-400e-a3b6-49bdc0002542" (UID: "419f4911-2179-400e-a3b6-49bdc0002542"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:00:04.119953 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:04.119924 2579 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/419f4911-2179-400e-a3b6-49bdc0002542-model-cache\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:00:04.119953 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:04.119947 2579 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/419f4911-2179-400e-a3b6-49bdc0002542-home\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:00:04.121637 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:04.121606 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/419f4911-2179-400e-a3b6-49bdc0002542-kube-api-access-rm449" (OuterVolumeSpecName: "kube-api-access-rm449") pod "419f4911-2179-400e-a3b6-49bdc0002542" (UID: "419f4911-2179-400e-a3b6-49bdc0002542"). InnerVolumeSpecName "kube-api-access-rm449". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:00:04.121759 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:04.121700 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/419f4911-2179-400e-a3b6-49bdc0002542-dshm" (OuterVolumeSpecName: "dshm") pod "419f4911-2179-400e-a3b6-49bdc0002542" (UID: "419f4911-2179-400e-a3b6-49bdc0002542"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:00:04.121889 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:04.121852 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/419f4911-2179-400e-a3b6-49bdc0002542-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "419f4911-2179-400e-a3b6-49bdc0002542" (UID: "419f4911-2179-400e-a3b6-49bdc0002542"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:00:04.185163 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:04.185116 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/419f4911-2179-400e-a3b6-49bdc0002542-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "419f4911-2179-400e-a3b6-49bdc0002542" (UID: "419f4911-2179-400e-a3b6-49bdc0002542"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:00:04.220844 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:04.220806 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/419f4911-2179-400e-a3b6-49bdc0002542-kserve-provision-location\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:00:04.220844 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:04.220838 2579 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/419f4911-2179-400e-a3b6-49bdc0002542-dshm\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:00:04.220844 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:04.220852 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/419f4911-2179-400e-a3b6-49bdc0002542-tls-certs\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:00:04.221076 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:04.220861 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rm449\" (UniqueName: \"kubernetes.io/projected/419f4911-2179-400e-a3b6-49bdc0002542-kube-api-access-rm449\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:00:04.321811 ip-10-0-136-53 kubenswrapper[2579]: E0422 19:00:04.321725 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-self-signed-certs: secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 22 19:00:04.321811 ip-10-0-136-53 kubenswrapper[2579]: E0422 19:00:04.321809 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tls-certs podName:4d5b86b1-22e8-41d2-a31f-418fbc311fe2 nodeName:}" failed. No retries permitted until 2026-04-22 19:00:05.321792717 +0000 UTC m=+791.770575025 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tls-certs") pod "scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd" (UID: "4d5b86b1-22e8-41d2-a31f-418fbc311fe2") : secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 22 19:00:04.737267 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:04.737230 2579 generic.go:358] "Generic (PLEG): container finished" podID="419f4911-2179-400e-a3b6-49bdc0002542" containerID="5fe393b7ae17e858868502bca8e074f68d7f9480e3c1abd4e561b98bdbe8f5e2" exitCode=0 Apr 22 19:00:04.737783 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:04.737315 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl" Apr 22 19:00:04.737783 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:04.737317 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl" event={"ID":"419f4911-2179-400e-a3b6-49bdc0002542","Type":"ContainerDied","Data":"5fe393b7ae17e858868502bca8e074f68d7f9480e3c1abd4e561b98bdbe8f5e2"} Apr 22 19:00:04.737783 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:04.737364 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl" event={"ID":"419f4911-2179-400e-a3b6-49bdc0002542","Type":"ContainerDied","Data":"5a420debf40a1bfe6e94a15e8086474f535e1fd0bacce76b1d2c8c0a953e275f"} Apr 22 19:00:04.737783 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:04.737385 2579 scope.go:117] "RemoveContainer" containerID="5fe393b7ae17e858868502bca8e074f68d7f9480e3c1abd4e561b98bdbe8f5e2" Apr 22 19:00:04.738008 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:04.737892 2579 kubelet_pods.go:1019] "Unable to retrieve pull secret, the image pull may not succeed." pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd" secret="" err="secret \"scheduler-ha-replicas-test-epp-sa-dockercfg-xjm7q\" not found" Apr 22 19:00:04.739191 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:04.739163 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd" podUID="4d5b86b1-22e8-41d2-a31f-418fbc311fe2" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 22 19:00:04.747879 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:04.747857 2579 scope.go:117] "RemoveContainer" containerID="42d3a9158e3c3384aaf3f96023fadee4672b9945f9b2349e75d3a5464a877ae5" Apr 22 19:00:04.757625 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:04.757590 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl"] Apr 22 19:00:04.761592 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:04.761567 2579 scope.go:117] "RemoveContainer" containerID="5fe393b7ae17e858868502bca8e074f68d7f9480e3c1abd4e561b98bdbe8f5e2" Apr 22 19:00:04.761854 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:04.761827 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-79c76456d4-t6knl"] Apr 22 19:00:04.761934 ip-10-0-136-53 kubenswrapper[2579]: E0422 19:00:04.761890 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fe393b7ae17e858868502bca8e074f68d7f9480e3c1abd4e561b98bdbe8f5e2\": container with ID starting with 5fe393b7ae17e858868502bca8e074f68d7f9480e3c1abd4e561b98bdbe8f5e2 not found: ID does not exist" containerID="5fe393b7ae17e858868502bca8e074f68d7f9480e3c1abd4e561b98bdbe8f5e2" Apr 22 19:00:04.761934 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:04.761914 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fe393b7ae17e858868502bca8e074f68d7f9480e3c1abd4e561b98bdbe8f5e2"} err="failed to get container status \"5fe393b7ae17e858868502bca8e074f68d7f9480e3c1abd4e561b98bdbe8f5e2\": rpc error: code = NotFound desc = could not find container \"5fe393b7ae17e858868502bca8e074f68d7f9480e3c1abd4e561b98bdbe8f5e2\": container with ID 
starting with 5fe393b7ae17e858868502bca8e074f68d7f9480e3c1abd4e561b98bdbe8f5e2 not found: ID does not exist" Apr 22 19:00:04.761934 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:04.761933 2579 scope.go:117] "RemoveContainer" containerID="42d3a9158e3c3384aaf3f96023fadee4672b9945f9b2349e75d3a5464a877ae5" Apr 22 19:00:04.762214 ip-10-0-136-53 kubenswrapper[2579]: E0422 19:00:04.762196 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42d3a9158e3c3384aaf3f96023fadee4672b9945f9b2349e75d3a5464a877ae5\": container with ID starting with 42d3a9158e3c3384aaf3f96023fadee4672b9945f9b2349e75d3a5464a877ae5 not found: ID does not exist" containerID="42d3a9158e3c3384aaf3f96023fadee4672b9945f9b2349e75d3a5464a877ae5" Apr 22 19:00:04.762254 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:04.762221 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42d3a9158e3c3384aaf3f96023fadee4672b9945f9b2349e75d3a5464a877ae5"} err="failed to get container status \"42d3a9158e3c3384aaf3f96023fadee4672b9945f9b2349e75d3a5464a877ae5\": rpc error: code = NotFound desc = could not find container \"42d3a9158e3c3384aaf3f96023fadee4672b9945f9b2349e75d3a5464a877ae5\": container with ID starting with 42d3a9158e3c3384aaf3f96023fadee4672b9945f9b2349e75d3a5464a877ae5 not found: ID does not exist" Apr 22 19:00:05.333554 ip-10-0-136-53 kubenswrapper[2579]: E0422 19:00:05.333512 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-self-signed-certs: secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 22 19:00:05.333771 ip-10-0-136-53 kubenswrapper[2579]: E0422 19:00:05.333591 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tls-certs podName:4d5b86b1-22e8-41d2-a31f-418fbc311fe2 nodeName:}" failed. No retries permitted until 2026-04-22 19:00:07.333573354 +0000 UTC m=+793.782355661 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tls-certs") pod "scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd" (UID: "4d5b86b1-22e8-41d2-a31f-418fbc311fe2") : secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 22 19:00:05.742089 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:05.742047 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd" podUID="4d5b86b1-22e8-41d2-a31f-418fbc311fe2" containerName="main" containerID="cri-o://a86d9bb4c892d5b3d6f3c90d383b720f49c4dac33473623e1a38dacf08fd2531" gracePeriod=30 Apr 22 19:00:05.742089 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:05.742070 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd" podUID="4d5b86b1-22e8-41d2-a31f-418fbc311fe2" containerName="tokenizer" containerID="cri-o://95e15a3ffde37155f9d8eba5031d1cfaf0c3d97602692c075068c309d8c2d49b" gracePeriod=30 Apr 22 19:00:05.743588 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:05.743552 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd" podUID="4d5b86b1-22e8-41d2-a31f-418fbc311fe2" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 22 19:00:06.164380 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:06.164298 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="419f4911-2179-400e-a3b6-49bdc0002542" path="/var/lib/kubelet/pods/419f4911-2179-400e-a3b6-49bdc0002542/volumes" Apr 22 19:00:06.754431 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:06.754399 2579 generic.go:358] "Generic (PLEG): container finished" podID="4d5b86b1-22e8-41d2-a31f-418fbc311fe2" containerID="a86d9bb4c892d5b3d6f3c90d383b720f49c4dac33473623e1a38dacf08fd2531" exitCode=0 Apr 22 19:00:06.754857 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:06.754476 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd" event={"ID":"4d5b86b1-22e8-41d2-a31f-418fbc311fe2","Type":"ContainerDied","Data":"a86d9bb4c892d5b3d6f3c90d383b720f49c4dac33473623e1a38dacf08fd2531"} Apr 22 19:00:07.358793 ip-10-0-136-53 kubenswrapper[2579]: E0422 19:00:07.358757 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-self-signed-certs: secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 22 19:00:07.358945 ip-10-0-136-53 kubenswrapper[2579]: E0422 19:00:07.358838 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tls-certs podName:4d5b86b1-22e8-41d2-a31f-418fbc311fe2 nodeName:}" failed. No retries permitted until 2026-04-22 19:00:11.358819588 +0000 UTC m=+797.807601896 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tls-certs") pod "scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd" (UID: "4d5b86b1-22e8-41d2-a31f-418fbc311fe2") : secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 22 19:00:10.955045 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:10.954955 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd" Apr 22 19:00:11.400655 ip-10-0-136-53 kubenswrapper[2579]: E0422 19:00:11.400613 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-self-signed-certs: secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 22 19:00:11.400840 ip-10-0-136-53 kubenswrapper[2579]: E0422 19:00:11.400705 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tls-certs podName:4d5b86b1-22e8-41d2-a31f-418fbc311fe2 nodeName:}" failed. No retries permitted until 2026-04-22 19:00:19.400686551 +0000 UTC m=+805.849468867 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tls-certs") pod "scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd" (UID: "4d5b86b1-22e8-41d2-a31f-418fbc311fe2") : secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 22 19:00:15.743606 ip-10-0-136-53 kubenswrapper[2579]: W0422 19:00:15.743564 2579 logging.go:55] [core] [Channel #26 SubChannel #27]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.37:9003", ServerName: "10.134.0.37:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.37:9003: connect: connection refused" Apr 22 19:00:16.743301 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:16.743253 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd" podUID="4d5b86b1-22e8-41d2-a31f-418fbc311fe2" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.37:9003\" within 1s: context deadline exceeded" Apr 22 19:00:19.477762 ip-10-0-136-53 kubenswrapper[2579]: E0422 19:00:19.477721 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-self-signed-certs: secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 22 19:00:19.478167 ip-10-0-136-53 kubenswrapper[2579]: E0422 19:00:19.477825 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tls-certs podName:4d5b86b1-22e8-41d2-a31f-418fbc311fe2 nodeName:}" failed. No retries permitted until 2026-04-22 19:00:35.477782594 +0000 UTC m=+821.926564909 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tls-certs") pod "scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd" (UID: "4d5b86b1-22e8-41d2-a31f-418fbc311fe2") : secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 22 19:00:25.742940 ip-10-0-136-53 kubenswrapper[2579]: W0422 19:00:25.742904 2579 logging.go:55] [core] [Channel #28 SubChannel #29]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.37:9003", ServerName: "10.134.0.37:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.37:9003: connect: connection refused" Apr 22 19:00:26.742786 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:26.742732 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd" podUID="4d5b86b1-22e8-41d2-a31f-418fbc311fe2" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.37:9003\" within 1s: context deadline exceeded" Apr 22 19:00:35.521765 ip-10-0-136-53 kubenswrapper[2579]: E0422 19:00:35.521729 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-self-signed-certs: secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 22 19:00:35.522165 ip-10-0-136-53 kubenswrapper[2579]: E0422 19:00:35.521812 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tls-certs podName:4d5b86b1-22e8-41d2-a31f-418fbc311fe2 nodeName:}" failed. No retries permitted until 2026-04-22 19:01:07.521795656 +0000 UTC m=+853.970577964 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tls-certs") pod "scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd" (UID: "4d5b86b1-22e8-41d2-a31f-418fbc311fe2") : secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 22 19:00:35.742628 ip-10-0-136-53 kubenswrapper[2579]: W0422 19:00:35.742592 2579 logging.go:55] [core] [Channel #30 SubChannel #31]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.37:9003", ServerName: "10.134.0.37:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.37:9003: connect: connection refused" Apr 22 19:00:35.856698 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:35.856669 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd_4d5b86b1-22e8-41d2-a31f-418fbc311fe2/tokenizer/0.log" Apr 22 19:00:35.857346 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:35.857322 2579 generic.go:358] "Generic (PLEG): container finished" podID="4d5b86b1-22e8-41d2-a31f-418fbc311fe2" containerID="95e15a3ffde37155f9d8eba5031d1cfaf0c3d97602692c075068c309d8c2d49b" exitCode=137 Apr 22 19:00:35.857426 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:35.857392 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd" event={"ID":"4d5b86b1-22e8-41d2-a31f-418fbc311fe2","Type":"ContainerDied","Data":"95e15a3ffde37155f9d8eba5031d1cfaf0c3d97602692c075068c309d8c2d49b"} Apr 22 19:00:35.950294 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:35.950271 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd_4d5b86b1-22e8-41d2-a31f-418fbc311fe2/tokenizer/0.log" Apr 22 19:00:35.950937 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:35.950920 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd" Apr 22 19:00:36.026601 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:36.026516 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tls-certs\") pod \"4d5b86b1-22e8-41d2-a31f-418fbc311fe2\" (UID: \"4d5b86b1-22e8-41d2-a31f-418fbc311fe2\") " Apr 22 19:00:36.026766 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:36.026616 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tokenizer-uds\") pod \"4d5b86b1-22e8-41d2-a31f-418fbc311fe2\" (UID: \"4d5b86b1-22e8-41d2-a31f-418fbc311fe2\") " Apr 22 19:00:36.026766 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:36.026667 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tokenizer-tmp\") pod \"4d5b86b1-22e8-41d2-a31f-418fbc311fe2\" (UID: \"4d5b86b1-22e8-41d2-a31f-418fbc311fe2\") " Apr 22 19:00:36.026766 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:36.026691 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-kserve-provision-location\") pod \"4d5b86b1-22e8-41d2-a31f-418fbc311fe2\" (UID: \"4d5b86b1-22e8-41d2-a31f-418fbc311fe2\") " Apr 22 19:00:36.026766 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:36.026720 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tokenizer-cache\") pod \"4d5b86b1-22e8-41d2-a31f-418fbc311fe2\" (UID: \"4d5b86b1-22e8-41d2-a31f-418fbc311fe2\") " Apr 22 19:00:36.026766 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:36.026749 2579 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-282gc\" (UniqueName: \"kubernetes.io/projected/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-kube-api-access-282gc\") pod \"4d5b86b1-22e8-41d2-a31f-418fbc311fe2\" (UID: \"4d5b86b1-22e8-41d2-a31f-418fbc311fe2\") " Apr 22 19:00:36.027032 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:36.026975 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "4d5b86b1-22e8-41d2-a31f-418fbc311fe2" (UID: "4d5b86b1-22e8-41d2-a31f-418fbc311fe2"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:00:36.027032 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:36.027004 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "4d5b86b1-22e8-41d2-a31f-418fbc311fe2" (UID: "4d5b86b1-22e8-41d2-a31f-418fbc311fe2"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:00:36.027123 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:36.027043 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "4d5b86b1-22e8-41d2-a31f-418fbc311fe2" (UID: "4d5b86b1-22e8-41d2-a31f-418fbc311fe2"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:00:36.027455 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:36.027425 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4d5b86b1-22e8-41d2-a31f-418fbc311fe2" (UID: "4d5b86b1-22e8-41d2-a31f-418fbc311fe2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:00:36.028808 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:36.028785 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "4d5b86b1-22e8-41d2-a31f-418fbc311fe2" (UID: "4d5b86b1-22e8-41d2-a31f-418fbc311fe2"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:00:36.028890 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:36.028863 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-kube-api-access-282gc" (OuterVolumeSpecName: "kube-api-access-282gc") pod "4d5b86b1-22e8-41d2-a31f-418fbc311fe2" (UID: "4d5b86b1-22e8-41d2-a31f-418fbc311fe2"). InnerVolumeSpecName "kube-api-access-282gc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:00:36.128171 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:36.128126 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tls-certs\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:00:36.128171 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:36.128162 2579 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tokenizer-uds\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:00:36.128171 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:36.128176 2579 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tokenizer-tmp\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:00:36.128418 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:36.128205 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-kserve-provision-location\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:00:36.128418 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:36.128218 2579 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-tokenizer-cache\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:00:36.128418 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:36.128230 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-282gc\" (UniqueName: \"kubernetes.io/projected/4d5b86b1-22e8-41d2-a31f-418fbc311fe2-kube-api-access-282gc\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:00:36.742784 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:36.742740 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd" podUID="4d5b86b1-22e8-41d2-a31f-418fbc311fe2" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.37:9003\" within 1s: context deadline exceeded" Apr 22 19:00:36.862386 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:36.862357 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd_4d5b86b1-22e8-41d2-a31f-418fbc311fe2/tokenizer/0.log" Apr 22 19:00:36.863059 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:36.863034 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd" event={"ID":"4d5b86b1-22e8-41d2-a31f-418fbc311fe2","Type":"ContainerDied","Data":"7ac3d04fe4940cfabed24ede7fc74a790499f2a3f204e4526a0a19a3c887281d"} Apr 22 19:00:36.863169 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:36.863058 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd" Apr 22 19:00:36.863169 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:36.863084 2579 scope.go:117] "RemoveContainer" containerID="95e15a3ffde37155f9d8eba5031d1cfaf0c3d97602692c075068c309d8c2d49b" Apr 22 19:00:36.871647 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:36.871626 2579 scope.go:117] "RemoveContainer" containerID="a86d9bb4c892d5b3d6f3c90d383b720f49c4dac33473623e1a38dacf08fd2531" Apr 22 19:00:36.879520 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:36.879478 2579 scope.go:117] "RemoveContainer" containerID="1145853eb4ad50a0bd31706dffd408482d1c1f428f30a9893b89070011a33b05" Apr 22 19:00:36.880175 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:36.880147 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd"] Apr 22 19:00:36.883690 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:36.883664 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5flgkjd"] Apr 22 19:00:38.164380 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:38.164341 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d5b86b1-22e8-41d2-a31f-418fbc311fe2" path="/var/lib/kubelet/pods/4d5b86b1-22e8-41d2-a31f-418fbc311fe2/volumes" Apr 22 19:00:56.511828 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.511788 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg"] Apr 22 19:00:56.512343 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.512321 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="419f4911-2179-400e-a3b6-49bdc0002542" containerName="storage-initializer" Apr 22 19:00:56.512343 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.512339 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="419f4911-2179-400e-a3b6-49bdc0002542" containerName="storage-initializer" Apr 22 19:00:56.512428 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.512351 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d5b86b1-22e8-41d2-a31f-418fbc311fe2" containerName="main" Apr 22 19:00:56.512428 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.512362 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d5b86b1-22e8-41d2-a31f-418fbc311fe2" containerName="main" Apr 22 19:00:56.512428 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.512387 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d5b86b1-22e8-41d2-a31f-418fbc311fe2" containerName="tokenizer" Apr 22 19:00:56.512428 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.512396 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d5b86b1-22e8-41d2-a31f-418fbc311fe2" containerName="tokenizer" Apr 22 19:00:56.512428 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.512412 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d5b86b1-22e8-41d2-a31f-418fbc311fe2" containerName="storage-initializer" Apr 22 19:00:56.512428 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.512420 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d5b86b1-22e8-41d2-a31f-418fbc311fe2" containerName="storage-initializer" Apr 22 19:00:56.512647 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.512431 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="419f4911-2179-400e-a3b6-49bdc0002542" containerName="main" Apr 22 19:00:56.512647 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.512439 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="419f4911-2179-400e-a3b6-49bdc0002542" containerName="main" Apr 22 19:00:56.512647 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.512551 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="4d5b86b1-22e8-41d2-a31f-418fbc311fe2" containerName="main" Apr 22 19:00:56.512647 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.512567 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="419f4911-2179-400e-a3b6-49bdc0002542" containerName="main" Apr 22 19:00:56.512647 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.512576 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="4d5b86b1-22e8-41d2-a31f-418fbc311fe2" containerName="tokenizer" Apr 22 19:00:56.516251 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.516229 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg" Apr 22 19:00:56.518448 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.518425 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"conv-test-round-trip-kserve-self-signed-certs\"" Apr 22 19:00:56.518609 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.518494 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4mhhb\"" Apr 22 19:00:56.523077 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.523050 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg"] Apr 22 19:00:56.613700 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.613652 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f50492db-5536-4dee-8cfb-9965041b048a-model-cache\") pod \"conv-test-round-trip-kserve-5798b8556-bszwg\" (UID: \"f50492db-5536-4dee-8cfb-9965041b048a\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg" Apr 22 19:00:56.613893 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.613717 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f50492db-5536-4dee-8cfb-9965041b048a-kserve-provision-location\") pod \"conv-test-round-trip-kserve-5798b8556-bszwg\" (UID: \"f50492db-5536-4dee-8cfb-9965041b048a\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg" Apr 22 19:00:56.613893 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.613779 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f50492db-5536-4dee-8cfb-9965041b048a-tls-certs\") pod \"conv-test-round-trip-kserve-5798b8556-bszwg\" (UID: \"f50492db-5536-4dee-8cfb-9965041b048a\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg" Apr 22 19:00:56.613893 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.613836 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f50492db-5536-4dee-8cfb-9965041b048a-home\") pod \"conv-test-round-trip-kserve-5798b8556-bszwg\" (UID: 
\"f50492db-5536-4dee-8cfb-9965041b048a\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg" Apr 22 19:00:56.613893 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.613869 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mts9l\" (UniqueName: \"kubernetes.io/projected/f50492db-5536-4dee-8cfb-9965041b048a-kube-api-access-mts9l\") pod \"conv-test-round-trip-kserve-5798b8556-bszwg\" (UID: \"f50492db-5536-4dee-8cfb-9965041b048a\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg" Apr 22 19:00:56.614055 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.613912 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f50492db-5536-4dee-8cfb-9965041b048a-dshm\") pod \"conv-test-round-trip-kserve-5798b8556-bszwg\" (UID: \"f50492db-5536-4dee-8cfb-9965041b048a\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg" Apr 22 19:00:56.714669 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.714629 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f50492db-5536-4dee-8cfb-9965041b048a-tls-certs\") pod \"conv-test-round-trip-kserve-5798b8556-bszwg\" (UID: \"f50492db-5536-4dee-8cfb-9965041b048a\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg" Apr 22 19:00:56.714669 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.714675 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f50492db-5536-4dee-8cfb-9965041b048a-home\") pod \"conv-test-round-trip-kserve-5798b8556-bszwg\" (UID: \"f50492db-5536-4dee-8cfb-9965041b048a\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg" Apr 22 19:00:56.714933 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.714693 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mts9l\" (UniqueName: \"kubernetes.io/projected/f50492db-5536-4dee-8cfb-9965041b048a-kube-api-access-mts9l\") pod \"conv-test-round-trip-kserve-5798b8556-bszwg\" (UID: \"f50492db-5536-4dee-8cfb-9965041b048a\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg" Apr 22 19:00:56.714933 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.714727 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f50492db-5536-4dee-8cfb-9965041b048a-dshm\") pod \"conv-test-round-trip-kserve-5798b8556-bszwg\" (UID: \"f50492db-5536-4dee-8cfb-9965041b048a\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg" Apr 22 19:00:56.714933 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.714759 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f50492db-5536-4dee-8cfb-9965041b048a-model-cache\") pod \"conv-test-round-trip-kserve-5798b8556-bszwg\" (UID: \"f50492db-5536-4dee-8cfb-9965041b048a\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg" Apr 22 19:00:56.714933 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.714789 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f50492db-5536-4dee-8cfb-9965041b048a-kserve-provision-location\") pod 
\"conv-test-round-trip-kserve-5798b8556-bszwg\" (UID: \"f50492db-5536-4dee-8cfb-9965041b048a\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg" Apr 22 19:00:56.715171 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.715122 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f50492db-5536-4dee-8cfb-9965041b048a-home\") pod \"conv-test-round-trip-kserve-5798b8556-bszwg\" (UID: \"f50492db-5536-4dee-8cfb-9965041b048a\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg" Apr 22 19:00:56.715171 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.715157 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f50492db-5536-4dee-8cfb-9965041b048a-kserve-provision-location\") pod \"conv-test-round-trip-kserve-5798b8556-bszwg\" (UID: \"f50492db-5536-4dee-8cfb-9965041b048a\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg" Apr 22 19:00:56.715246 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.715220 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f50492db-5536-4dee-8cfb-9965041b048a-model-cache\") pod \"conv-test-round-trip-kserve-5798b8556-bszwg\" (UID: \"f50492db-5536-4dee-8cfb-9965041b048a\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg" Apr 22 19:00:56.717058 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.717034 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f50492db-5536-4dee-8cfb-9965041b048a-dshm\") pod \"conv-test-round-trip-kserve-5798b8556-bszwg\" (UID: \"f50492db-5536-4dee-8cfb-9965041b048a\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg" Apr 22 19:00:56.717230 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.717211 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f50492db-5536-4dee-8cfb-9965041b048a-tls-certs\") pod \"conv-test-round-trip-kserve-5798b8556-bszwg\" (UID: \"f50492db-5536-4dee-8cfb-9965041b048a\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg" Apr 22 19:00:56.721859 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.721835 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mts9l\" (UniqueName: \"kubernetes.io/projected/f50492db-5536-4dee-8cfb-9965041b048a-kube-api-access-mts9l\") pod \"conv-test-round-trip-kserve-5798b8556-bszwg\" (UID: \"f50492db-5536-4dee-8cfb-9965041b048a\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg" Apr 22 19:00:56.829844 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.829746 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg" Apr 22 19:00:56.958357 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:56.958323 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg"] Apr 22 19:00:56.959932 ip-10-0-136-53 kubenswrapper[2579]: W0422 19:00:56.959904 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf50492db_5536_4dee_8cfb_9965041b048a.slice/crio-fe6e210453e19562e6d6fcef74cb598d85a611f69bc62e8ef1cf7a8bdf41c5eb WatchSource:0}: Error finding container fe6e210453e19562e6d6fcef74cb598d85a611f69bc62e8ef1cf7a8bdf41c5eb: Status 404 returned error can't find the container with id fe6e210453e19562e6d6fcef74cb598d85a611f69bc62e8ef1cf7a8bdf41c5eb Apr 22 19:00:57.935497 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:57.935456 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg" event={"ID":"f50492db-5536-4dee-8cfb-9965041b048a","Type":"ContainerStarted","Data":"51bed2c71df17266f72a2c1a12aeabd57df88a60e74878c88b967107a051dba2"} Apr 22 19:00:57.935884 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:00:57.935516 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg" event={"ID":"f50492db-5536-4dee-8cfb-9965041b048a","Type":"ContainerStarted","Data":"fe6e210453e19562e6d6fcef74cb598d85a611f69bc62e8ef1cf7a8bdf41c5eb"} Apr 22 19:01:01.952900 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:01.952861 2579 generic.go:358] "Generic (PLEG): container finished" podID="f50492db-5536-4dee-8cfb-9965041b048a" containerID="51bed2c71df17266f72a2c1a12aeabd57df88a60e74878c88b967107a051dba2" exitCode=0 Apr 22 19:01:01.952900 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:01.952902 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg" event={"ID":"f50492db-5536-4dee-8cfb-9965041b048a","Type":"ContainerDied","Data":"51bed2c71df17266f72a2c1a12aeabd57df88a60e74878c88b967107a051dba2"} Apr 22 19:01:06.690956 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:06.690908 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4"] Apr 22 19:01:06.695708 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:06.695678 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" Apr 22 19:01:06.698272 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:06.698247 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-x9rnl\"" Apr 22 19:01:06.698436 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:06.698306 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 22 19:01:06.710631 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:06.710600 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4"] Apr 22 19:01:06.715629 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:06.715337 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nhx6\" (UniqueName: \"kubernetes.io/projected/d82094a6-c96d-470d-9508-d03f34892812-kube-api-access-8nhx6\") pod \"stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4\" (UID: \"d82094a6-c96d-470d-9508-d03f34892812\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" Apr 22 19:01:06.715629 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:06.715415 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d82094a6-c96d-470d-9508-d03f34892812-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4\" (UID: \"d82094a6-c96d-470d-9508-d03f34892812\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" Apr 22 19:01:06.715629 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:06.715461 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d82094a6-c96d-470d-9508-d03f34892812-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4\" (UID: \"d82094a6-c96d-470d-9508-d03f34892812\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" Apr 22 19:01:06.715629 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:06.715537 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d82094a6-c96d-470d-9508-d03f34892812-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4\" (UID: \"d82094a6-c96d-470d-9508-d03f34892812\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" Apr 22 19:01:06.715629 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:06.715567 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d82094a6-c96d-470d-9508-d03f34892812-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4\" (UID: \"d82094a6-c96d-470d-9508-d03f34892812\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" Apr 22 19:01:06.715918 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:06.715644 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/d82094a6-c96d-470d-9508-d03f34892812-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4\" (UID: \"d82094a6-c96d-470d-9508-d03f34892812\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" Apr 22 19:01:06.816731 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:06.816691 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nhx6\" (UniqueName: \"kubernetes.io/projected/d82094a6-c96d-470d-9508-d03f34892812-kube-api-access-8nhx6\") pod \"stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4\" (UID: \"d82094a6-c96d-470d-9508-d03f34892812\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" Apr 22 19:01:06.816921 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:06.816785 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d82094a6-c96d-470d-9508-d03f34892812-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4\" (UID: \"d82094a6-c96d-470d-9508-d03f34892812\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" Apr 22 19:01:06.816921 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:06.816828 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d82094a6-c96d-470d-9508-d03f34892812-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4\" (UID: \"d82094a6-c96d-470d-9508-d03f34892812\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" Apr 22 19:01:06.816921 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:06.816865 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d82094a6-c96d-470d-9508-d03f34892812-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4\" (UID: \"d82094a6-c96d-470d-9508-d03f34892812\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" Apr 22 19:01:06.816921 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:06.816893 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d82094a6-c96d-470d-9508-d03f34892812-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4\" (UID: \"d82094a6-c96d-470d-9508-d03f34892812\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" Apr 22 19:01:06.817133 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:06.816952 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d82094a6-c96d-470d-9508-d03f34892812-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4\" (UID: \"d82094a6-c96d-470d-9508-d03f34892812\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" Apr 22 19:01:06.817546 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:06.817517 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d82094a6-c96d-470d-9508-d03f34892812-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4\" (UID: \"d82094a6-c96d-470d-9508-d03f34892812\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" Apr 22 19:01:06.817704 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:06.817678 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d82094a6-c96d-470d-9508-d03f34892812-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4\" (UID: \"d82094a6-c96d-470d-9508-d03f34892812\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" Apr 22 19:01:06.817834 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:06.817811 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d82094a6-c96d-470d-9508-d03f34892812-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4\" (UID: \"d82094a6-c96d-470d-9508-d03f34892812\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" Apr 22 19:01:06.818245 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:06.818217 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d82094a6-c96d-470d-9508-d03f34892812-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4\" (UID: \"d82094a6-c96d-470d-9508-d03f34892812\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" Apr 22 19:01:06.820751 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:06.820707 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d82094a6-c96d-470d-9508-d03f34892812-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4\" (UID: \"d82094a6-c96d-470d-9508-d03f34892812\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" Apr 22 19:01:06.824703 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:06.824676 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nhx6\" (UniqueName: \"kubernetes.io/projected/d82094a6-c96d-470d-9508-d03f34892812-kube-api-access-8nhx6\") pod \"stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4\" (UID: \"d82094a6-c96d-470d-9508-d03f34892812\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" Apr 22 19:01:07.011221 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:07.011177 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" Apr 22 19:01:07.160686 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:07.160519 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4"] Apr 22 19:01:07.164652 ip-10-0-136-53 kubenswrapper[2579]: W0422 19:01:07.164579 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd82094a6_c96d_470d_9508_d03f34892812.slice/crio-8f34295c9b4e670e9b988e80f8d3a1f58dd4fedab4fc8263e8642cd2ca74ee72 WatchSource:0}: Error finding container 8f34295c9b4e670e9b988e80f8d3a1f58dd4fedab4fc8263e8642cd2ca74ee72: Status 404 returned error can't find the container with id 8f34295c9b4e670e9b988e80f8d3a1f58dd4fedab4fc8263e8642cd2ca74ee72 Apr 22 19:01:07.980586 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:07.980548 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" event={"ID":"d82094a6-c96d-470d-9508-d03f34892812","Type":"ContainerStarted","Data":"b5bd12d099748bcbec9bdd17bda743d04d7b4ec94924596d065f5a5605428eaf"} Apr 22 19:01:07.980586 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:07.980588 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" event={"ID":"d82094a6-c96d-470d-9508-d03f34892812","Type":"ContainerStarted","Data":"8f34295c9b4e670e9b988e80f8d3a1f58dd4fedab4fc8263e8642cd2ca74ee72"} Apr 22 19:01:08.986579 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:08.986535 2579 generic.go:358] "Generic (PLEG): container finished" podID="d82094a6-c96d-470d-9508-d03f34892812" containerID="b5bd12d099748bcbec9bdd17bda743d04d7b4ec94924596d065f5a5605428eaf" exitCode=0 Apr 22 19:01:08.987037 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:08.986599 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" event={"ID":"d82094a6-c96d-470d-9508-d03f34892812","Type":"ContainerDied","Data":"b5bd12d099748bcbec9bdd17bda743d04d7b4ec94924596d065f5a5605428eaf"} Apr 22 19:01:09.321724 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:09.320474 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg"] Apr 22 19:01:09.992467 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:09.992425 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" event={"ID":"d82094a6-c96d-470d-9508-d03f34892812","Type":"ContainerStarted","Data":"50ba5028d69731696a6a1700de69968b7aa7204d29d2df5ef9850ccdbfe3209f"} Apr 22 19:01:09.992467 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:09.992466 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" event={"ID":"d82094a6-c96d-470d-9508-d03f34892812","Type":"ContainerStarted","Data":"7b2e6850de7b383bb54dec86981be320bb2ee41adb965977e95fc9c9eddfa906"} Apr 22 19:01:09.992995 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:09.992557 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" Apr 22 19:01:10.011854 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:10.011789 2579 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" podStartSLOduration=4.011770138 podStartE2EDuration="4.011770138s" podCreationTimestamp="2026-04-22 19:01:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:01:10.009815262 +0000 UTC m=+856.458597604" watchObservedRunningTime="2026-04-22 19:01:10.011770138 +0000 UTC m=+856.460552469" Apr 22 19:01:17.012922 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:17.011970 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" Apr 22 19:01:17.013941 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:17.013873 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" Apr 22 19:01:17.015533 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:17.015461 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" podUID="d82094a6-c96d-470d-9508-d03f34892812" containerName="tokenizer" probeResult="failure" output="Get \"http://10.134.0.39:8082/healthz\": dial tcp 10.134.0.39:8082: connect: connection refused" Apr 22 19:01:27.013227 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:27.013190 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" Apr 22 19:01:27.014773 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:27.014739 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" Apr 22 19:01:35.102471 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:35.102428 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg" event={"ID":"f50492db-5536-4dee-8cfb-9965041b048a","Type":"ContainerStarted","Data":"6903e7d66f5cb219a836f807c4e51b2485165ede8c69c295996e476eb4ab4943"} Apr 22 19:01:35.102932 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:35.102582 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg" podUID="f50492db-5536-4dee-8cfb-9965041b048a" containerName="main" containerID="cri-o://6903e7d66f5cb219a836f807c4e51b2485165ede8c69c295996e476eb4ab4943" gracePeriod=30 Apr 22 19:01:35.120819 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:35.120753 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg" podStartSLOduration=6.728887265 podStartE2EDuration="39.120732611s" podCreationTimestamp="2026-04-22 19:00:56 +0000 UTC" firstStartedPulling="2026-04-22 19:01:01.954092269 +0000 UTC m=+848.402874577" lastFinishedPulling="2026-04-22 19:01:34.345937615 +0000 UTC m=+880.794719923" observedRunningTime="2026-04-22 19:01:35.117950259 +0000 UTC m=+881.566732592" watchObservedRunningTime="2026-04-22 19:01:35.120732611 +0000 UTC m=+881.569514945" Apr 22 19:01:36.829895 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:36.829854 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg" Apr 22 19:01:47.073643 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:47.073604 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" Apr 22 19:01:54.104266 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:54.104241 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tcrqv_e741af54-1487-45cb-b5ea-22577783221e/ovn-acl-logging/0.log" Apr 22 19:01:54.104708 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:01:54.104239 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tcrqv_e741af54-1487-45cb-b5ea-22577783221e/ovn-acl-logging/0.log" Apr 22 19:02:05.213838 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:02:05.213807 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-round-trip-kserve-5798b8556-bszwg_f50492db-5536-4dee-8cfb-9965041b048a/main/0.log" Apr 22 19:02:05.214347 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:02:05.214132 2579 generic.go:358] "Generic (PLEG): container finished" podID="f50492db-5536-4dee-8cfb-9965041b048a" containerID="6903e7d66f5cb219a836f807c4e51b2485165ede8c69c295996e476eb4ab4943" exitCode=137 Apr 22 19:02:05.214347 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:02:05.214205 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg" event={"ID":"f50492db-5536-4dee-8cfb-9965041b048a","Type":"ContainerDied","Data":"6903e7d66f5cb219a836f807c4e51b2485165ede8c69c295996e476eb4ab4943"} Apr 22 19:02:05.814327 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:02:05.814304 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-round-trip-kserve-5798b8556-bszwg_f50492db-5536-4dee-8cfb-9965041b048a/main/0.log" Apr 22 19:02:05.814654 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:02:05.814639 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg" Apr 22 19:02:05.866466 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:02:05.866426 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mts9l\" (UniqueName: \"kubernetes.io/projected/f50492db-5536-4dee-8cfb-9965041b048a-kube-api-access-mts9l\") pod \"f50492db-5536-4dee-8cfb-9965041b048a\" (UID: \"f50492db-5536-4dee-8cfb-9965041b048a\") " Apr 22 19:02:05.866695 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:02:05.866571 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f50492db-5536-4dee-8cfb-9965041b048a-dshm\") pod \"f50492db-5536-4dee-8cfb-9965041b048a\" (UID: \"f50492db-5536-4dee-8cfb-9965041b048a\") " Apr 22 19:02:05.866695 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:02:05.866606 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f50492db-5536-4dee-8cfb-9965041b048a-home\") pod \"f50492db-5536-4dee-8cfb-9965041b048a\" (UID: \"f50492db-5536-4dee-8cfb-9965041b048a\") " Apr 22 19:02:05.866695 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:02:05.866642 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f50492db-5536-4dee-8cfb-9965041b048a-kserve-provision-location\") pod \"f50492db-5536-4dee-8cfb-9965041b048a\" (UID: \"f50492db-5536-4dee-8cfb-9965041b048a\") " Apr 22 19:02:05.866695 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:02:05.866680 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f50492db-5536-4dee-8cfb-9965041b048a-model-cache\") pod \"f50492db-5536-4dee-8cfb-9965041b048a\" (UID: \"f50492db-5536-4dee-8cfb-9965041b048a\") " Apr 22 19:02:05.866902 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:02:05.866702 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f50492db-5536-4dee-8cfb-9965041b048a-tls-certs\") pod \"f50492db-5536-4dee-8cfb-9965041b048a\" (UID: \"f50492db-5536-4dee-8cfb-9965041b048a\") " Apr 22 19:02:05.866902 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:02:05.866833 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f50492db-5536-4dee-8cfb-9965041b048a-home" (OuterVolumeSpecName: "home") pod "f50492db-5536-4dee-8cfb-9965041b048a" (UID: "f50492db-5536-4dee-8cfb-9965041b048a"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:02:05.867016 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:02:05.866933 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f50492db-5536-4dee-8cfb-9965041b048a-model-cache" (OuterVolumeSpecName: "model-cache") pod "f50492db-5536-4dee-8cfb-9965041b048a" (UID: "f50492db-5536-4dee-8cfb-9965041b048a"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:02:05.867067 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:02:05.867032 2579 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f50492db-5536-4dee-8cfb-9965041b048a-home\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:02:05.867067 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:02:05.867051 2579 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f50492db-5536-4dee-8cfb-9965041b048a-model-cache\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:02:05.868771 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:02:05.868743 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f50492db-5536-4dee-8cfb-9965041b048a-dshm" (OuterVolumeSpecName: "dshm") pod "f50492db-5536-4dee-8cfb-9965041b048a" (UID: "f50492db-5536-4dee-8cfb-9965041b048a"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:02:05.868771 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:02:05.868757 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f50492db-5536-4dee-8cfb-9965041b048a-kube-api-access-mts9l" (OuterVolumeSpecName: "kube-api-access-mts9l") pod "f50492db-5536-4dee-8cfb-9965041b048a" (UID: "f50492db-5536-4dee-8cfb-9965041b048a"). InnerVolumeSpecName "kube-api-access-mts9l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:02:05.868932 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:02:05.868841 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f50492db-5536-4dee-8cfb-9965041b048a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f50492db-5536-4dee-8cfb-9965041b048a" (UID: "f50492db-5536-4dee-8cfb-9965041b048a"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:02:05.922098 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:02:05.922033 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f50492db-5536-4dee-8cfb-9965041b048a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f50492db-5536-4dee-8cfb-9965041b048a" (UID: "f50492db-5536-4dee-8cfb-9965041b048a"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:02:05.968367 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:02:05.968278 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f50492db-5536-4dee-8cfb-9965041b048a-tls-certs\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:02:05.968367 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:02:05.968310 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mts9l\" (UniqueName: \"kubernetes.io/projected/f50492db-5536-4dee-8cfb-9965041b048a-kube-api-access-mts9l\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:02:05.968367 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:02:05.968320 2579 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f50492db-5536-4dee-8cfb-9965041b048a-dshm\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:02:05.968367 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:02:05.968330 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f50492db-5536-4dee-8cfb-9965041b048a-kserve-provision-location\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:02:06.218671 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:02:06.218593 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-round-trip-kserve-5798b8556-bszwg_f50492db-5536-4dee-8cfb-9965041b048a/main/0.log" Apr 22 19:02:06.219033 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:02:06.218957 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg" event={"ID":"f50492db-5536-4dee-8cfb-9965041b048a","Type":"ContainerDied","Data":"fe6e210453e19562e6d6fcef74cb598d85a611f69bc62e8ef1cf7a8bdf41c5eb"} Apr 22 19:02:06.219033 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:02:06.218990 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg" Apr 22 19:02:06.219033 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:02:06.219003 2579 scope.go:117] "RemoveContainer" containerID="6903e7d66f5cb219a836f807c4e51b2485165ede8c69c295996e476eb4ab4943" Apr 22 19:02:06.227447 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:02:06.227419 2579 scope.go:117] "RemoveContainer" containerID="51bed2c71df17266f72a2c1a12aeabd57df88a60e74878c88b967107a051dba2" Apr 22 19:02:06.235356 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:02:06.235325 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg"] Apr 22 19:02:06.240307 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:02:06.240281 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-5798b8556-bszwg"] Apr 22 19:02:08.164451 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:02:08.164328 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f50492db-5536-4dee-8cfb-9965041b048a" path="/var/lib/kubelet/pods/f50492db-5536-4dee-8cfb-9965041b048a/volumes" Apr 22 19:03:27.598268 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:27.598173 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4"] Apr 22 19:03:27.598799 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:27.598626 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" podUID="d82094a6-c96d-470d-9508-d03f34892812" containerName="main" containerID="cri-o://7b2e6850de7b383bb54dec86981be320bb2ee41adb965977e95fc9c9eddfa906" gracePeriod=30 Apr 22 19:03:27.598864 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:27.598736 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" podUID="d82094a6-c96d-470d-9508-d03f34892812" containerName="tokenizer" containerID="cri-o://50ba5028d69731696a6a1700de69968b7aa7204d29d2df5ef9850ccdbfe3209f" gracePeriod=30 Apr 22 19:03:28.507781 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:28.507743 2579 generic.go:358] "Generic (PLEG): container finished" podID="d82094a6-c96d-470d-9508-d03f34892812" containerID="7b2e6850de7b383bb54dec86981be320bb2ee41adb965977e95fc9c9eddfa906" exitCode=0 Apr 22 19:03:28.507995 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:28.507825 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" event={"ID":"d82094a6-c96d-470d-9508-d03f34892812","Type":"ContainerDied","Data":"7b2e6850de7b383bb54dec86981be320bb2ee41adb965977e95fc9c9eddfa906"} Apr 22 19:03:28.946637 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:28.946613 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" Apr 22 19:03:29.022436 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:29.022402 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nhx6\" (UniqueName: \"kubernetes.io/projected/d82094a6-c96d-470d-9508-d03f34892812-kube-api-access-8nhx6\") pod \"d82094a6-c96d-470d-9508-d03f34892812\" (UID: \"d82094a6-c96d-470d-9508-d03f34892812\") " Apr 22 19:03:29.022660 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:29.022452 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d82094a6-c96d-470d-9508-d03f34892812-tls-certs\") pod \"d82094a6-c96d-470d-9508-d03f34892812\" (UID: \"d82094a6-c96d-470d-9508-d03f34892812\") " Apr 22 19:03:29.022660 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:29.022526 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d82094a6-c96d-470d-9508-d03f34892812-kserve-provision-location\") pod \"d82094a6-c96d-470d-9508-d03f34892812\" (UID: \"d82094a6-c96d-470d-9508-d03f34892812\") " Apr 22 19:03:29.022660 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:29.022571 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d82094a6-c96d-470d-9508-d03f34892812-tokenizer-cache\") pod \"d82094a6-c96d-470d-9508-d03f34892812\" (UID: \"d82094a6-c96d-470d-9508-d03f34892812\") " Apr 22 19:03:29.022790 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:29.022670 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d82094a6-c96d-470d-9508-d03f34892812-tokenizer-uds\") pod \"d82094a6-c96d-470d-9508-d03f34892812\" (UID: \"d82094a6-c96d-470d-9508-d03f34892812\") " Apr 22 19:03:29.022790 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:29.022698 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d82094a6-c96d-470d-9508-d03f34892812-tokenizer-tmp\") pod \"d82094a6-c96d-470d-9508-d03f34892812\" (UID: \"d82094a6-c96d-470d-9508-d03f34892812\") " Apr 22 19:03:29.022951 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:29.022899 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d82094a6-c96d-470d-9508-d03f34892812-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "d82094a6-c96d-470d-9508-d03f34892812" (UID: "d82094a6-c96d-470d-9508-d03f34892812"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:03:29.023078 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:29.022946 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d82094a6-c96d-470d-9508-d03f34892812-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "d82094a6-c96d-470d-9508-d03f34892812" (UID: "d82094a6-c96d-470d-9508-d03f34892812"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:03:29.023232 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:29.023204 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d82094a6-c96d-470d-9508-d03f34892812-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "d82094a6-c96d-470d-9508-d03f34892812" (UID: "d82094a6-c96d-470d-9508-d03f34892812"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:03:29.023328 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:29.023309 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d82094a6-c96d-470d-9508-d03f34892812-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d82094a6-c96d-470d-9508-d03f34892812" (UID: "d82094a6-c96d-470d-9508-d03f34892812"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:03:29.024664 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:29.024641 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d82094a6-c96d-470d-9508-d03f34892812-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d82094a6-c96d-470d-9508-d03f34892812" (UID: "d82094a6-c96d-470d-9508-d03f34892812"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:03:29.024728 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:29.024690 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d82094a6-c96d-470d-9508-d03f34892812-kube-api-access-8nhx6" (OuterVolumeSpecName: "kube-api-access-8nhx6") pod "d82094a6-c96d-470d-9508-d03f34892812" (UID: "d82094a6-c96d-470d-9508-d03f34892812"). InnerVolumeSpecName "kube-api-access-8nhx6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:03:29.123996 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:29.123899 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d82094a6-c96d-470d-9508-d03f34892812-kserve-provision-location\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:03:29.123996 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:29.123931 2579 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d82094a6-c96d-470d-9508-d03f34892812-tokenizer-cache\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:03:29.123996 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:29.123945 2579 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d82094a6-c96d-470d-9508-d03f34892812-tokenizer-uds\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:03:29.123996 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:29.123956 2579 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d82094a6-c96d-470d-9508-d03f34892812-tokenizer-tmp\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:03:29.123996 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:29.123968 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8nhx6\" (UniqueName: \"kubernetes.io/projected/d82094a6-c96d-470d-9508-d03f34892812-kube-api-access-8nhx6\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:03:29.123996 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:29.123981 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d82094a6-c96d-470d-9508-d03f34892812-tls-certs\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:03:29.512903 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:29.512867 2579 generic.go:358] "Generic (PLEG): container finished" podID="d82094a6-c96d-470d-9508-d03f34892812" containerID="50ba5028d69731696a6a1700de69968b7aa7204d29d2df5ef9850ccdbfe3209f" exitCode=0 Apr 22 19:03:29.512903 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:29.512906 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" event={"ID":"d82094a6-c96d-470d-9508-d03f34892812","Type":"ContainerDied","Data":"50ba5028d69731696a6a1700de69968b7aa7204d29d2df5ef9850ccdbfe3209f"} Apr 22 19:03:29.513168 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:29.512929 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" event={"ID":"d82094a6-c96d-470d-9508-d03f34892812","Type":"ContainerDied","Data":"8f34295c9b4e670e9b988e80f8d3a1f58dd4fedab4fc8263e8642cd2ca74ee72"} Apr 22 19:03:29.513168 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:29.512944 2579 scope.go:117] "RemoveContainer" containerID="50ba5028d69731696a6a1700de69968b7aa7204d29d2df5ef9850ccdbfe3209f" Apr 22 19:03:29.513168 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:29.512977 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4" Apr 22 19:03:29.521517 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:29.521478 2579 scope.go:117] "RemoveContainer" containerID="7b2e6850de7b383bb54dec86981be320bb2ee41adb965977e95fc9c9eddfa906" Apr 22 19:03:29.531669 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:29.531647 2579 scope.go:117] "RemoveContainer" containerID="b5bd12d099748bcbec9bdd17bda743d04d7b4ec94924596d065f5a5605428eaf" Apr 22 19:03:29.534450 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:29.534421 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4"] Apr 22 19:03:29.536329 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:29.536304 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-4npj4"] Apr 22 19:03:29.540136 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:29.540120 2579 scope.go:117] "RemoveContainer" containerID="50ba5028d69731696a6a1700de69968b7aa7204d29d2df5ef9850ccdbfe3209f" Apr 22 19:03:29.540432 ip-10-0-136-53 kubenswrapper[2579]: E0422 19:03:29.540413 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50ba5028d69731696a6a1700de69968b7aa7204d29d2df5ef9850ccdbfe3209f\": container with ID starting with 50ba5028d69731696a6a1700de69968b7aa7204d29d2df5ef9850ccdbfe3209f not found: ID does not exist" containerID="50ba5028d69731696a6a1700de69968b7aa7204d29d2df5ef9850ccdbfe3209f" Apr 22 19:03:29.540504 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:29.540443 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50ba5028d69731696a6a1700de69968b7aa7204d29d2df5ef9850ccdbfe3209f"} err="failed to get container status \"50ba5028d69731696a6a1700de69968b7aa7204d29d2df5ef9850ccdbfe3209f\": rpc error: code = NotFound desc = could not find container \"50ba5028d69731696a6a1700de69968b7aa7204d29d2df5ef9850ccdbfe3209f\": container with ID starting with 50ba5028d69731696a6a1700de69968b7aa7204d29d2df5ef9850ccdbfe3209f not found: ID does not exist" Apr 22 19:03:29.540504 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:29.540465 2579 scope.go:117] "RemoveContainer" containerID="7b2e6850de7b383bb54dec86981be320bb2ee41adb965977e95fc9c9eddfa906" Apr 22 19:03:29.540724 ip-10-0-136-53 kubenswrapper[2579]: E0422 19:03:29.540703 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b2e6850de7b383bb54dec86981be320bb2ee41adb965977e95fc9c9eddfa906\": container with ID starting with 7b2e6850de7b383bb54dec86981be320bb2ee41adb965977e95fc9c9eddfa906 not found: ID does not exist" containerID="7b2e6850de7b383bb54dec86981be320bb2ee41adb965977e95fc9c9eddfa906" Apr 22 19:03:29.540768 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:29.540731 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b2e6850de7b383bb54dec86981be320bb2ee41adb965977e95fc9c9eddfa906"} err="failed to get container status \"7b2e6850de7b383bb54dec86981be320bb2ee41adb965977e95fc9c9eddfa906\": rpc error: code = NotFound desc = could not find container \"7b2e6850de7b383bb54dec86981be320bb2ee41adb965977e95fc9c9eddfa906\": container with ID starting with 7b2e6850de7b383bb54dec86981be320bb2ee41adb965977e95fc9c9eddfa906 not found: ID does not exist" Apr 22 19:03:29.540768 
ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:29.540748 2579 scope.go:117] "RemoveContainer" containerID="b5bd12d099748bcbec9bdd17bda743d04d7b4ec94924596d065f5a5605428eaf" Apr 22 19:03:29.540988 ip-10-0-136-53 kubenswrapper[2579]: E0422 19:03:29.540969 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5bd12d099748bcbec9bdd17bda743d04d7b4ec94924596d065f5a5605428eaf\": container with ID starting with b5bd12d099748bcbec9bdd17bda743d04d7b4ec94924596d065f5a5605428eaf not found: ID does not exist" containerID="b5bd12d099748bcbec9bdd17bda743d04d7b4ec94924596d065f5a5605428eaf" Apr 22 19:03:29.541071 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:29.540990 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5bd12d099748bcbec9bdd17bda743d04d7b4ec94924596d065f5a5605428eaf"} err="failed to get container status \"b5bd12d099748bcbec9bdd17bda743d04d7b4ec94924596d065f5a5605428eaf\": rpc error: code = NotFound desc = could not find container \"b5bd12d099748bcbec9bdd17bda743d04d7b4ec94924596d065f5a5605428eaf\": container with ID starting with b5bd12d099748bcbec9bdd17bda743d04d7b4ec94924596d065f5a5605428eaf not found: ID does not exist" Apr 22 19:03:30.164875 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:30.164839 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d82094a6-c96d-470d-9508-d03f34892812" path="/var/lib/kubelet/pods/d82094a6-c96d-470d-9508-d03f34892812/volumes" Apr 22 19:03:52.316117 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.316074 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf"] Apr 22 19:03:52.316732 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.316598 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d82094a6-c96d-470d-9508-d03f34892812" containerName="storage-initializer" Apr 22 19:03:52.316732 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.316618 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="d82094a6-c96d-470d-9508-d03f34892812" containerName="storage-initializer" Apr 22 19:03:52.316732 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.316635 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d82094a6-c96d-470d-9508-d03f34892812" containerName="main" Apr 22 19:03:52.316732 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.316644 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="d82094a6-c96d-470d-9508-d03f34892812" containerName="main" Apr 22 19:03:52.316732 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.316657 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f50492db-5536-4dee-8cfb-9965041b048a" containerName="storage-initializer" Apr 22 19:03:52.316732 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.316665 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="f50492db-5536-4dee-8cfb-9965041b048a" containerName="storage-initializer" Apr 22 19:03:52.316732 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.316686 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d82094a6-c96d-470d-9508-d03f34892812" containerName="tokenizer" Apr 22 19:03:52.316732 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.316694 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="d82094a6-c96d-470d-9508-d03f34892812" containerName="tokenizer" Apr 22 19:03:52.316732 ip-10-0-136-53 
kubenswrapper[2579]: I0422 19:03:52.316711 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f50492db-5536-4dee-8cfb-9965041b048a" containerName="main" Apr 22 19:03:52.316732 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.316720 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="f50492db-5536-4dee-8cfb-9965041b048a" containerName="main" Apr 22 19:03:52.317404 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.316791 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="d82094a6-c96d-470d-9508-d03f34892812" containerName="tokenizer" Apr 22 19:03:52.317404 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.316808 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="f50492db-5536-4dee-8cfb-9965041b048a" containerName="main" Apr 22 19:03:52.317404 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.316816 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="d82094a6-c96d-470d-9508-d03f34892812" containerName="main" Apr 22 19:03:52.319111 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.319089 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" Apr 22 19:03:52.321190 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.321168 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4mhhb\"" Apr 22 19:03:52.321320 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.321302 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\"" Apr 22 19:03:52.328642 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.328612 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf"] Apr 22 19:03:52.444375 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.444331 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk7vq\" (UniqueName: \"kubernetes.io/projected/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853-kube-api-access-dk7vq\") pod \"custom-route-timeout-test-kserve-9c99f6755-vvwsf\" (UID: \"0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" Apr 22 19:03:52.444375 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.444379 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853-tls-certs\") pod \"custom-route-timeout-test-kserve-9c99f6755-vvwsf\" (UID: \"0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" Apr 22 19:03:52.444652 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.444506 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853-home\") pod \"custom-route-timeout-test-kserve-9c99f6755-vvwsf\" (UID: \"0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" Apr 22 19:03:52.444652 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.444537 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-9c99f6755-vvwsf\" (UID: \"0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" Apr 22 19:03:52.444652 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.444562 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853-dshm\") pod \"custom-route-timeout-test-kserve-9c99f6755-vvwsf\" (UID: \"0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" Apr 22 19:03:52.444652 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.444621 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853-model-cache\") pod \"custom-route-timeout-test-kserve-9c99f6755-vvwsf\" (UID: \"0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" Apr 22 19:03:52.545107 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.545066 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853-home\") pod \"custom-route-timeout-test-kserve-9c99f6755-vvwsf\" (UID: \"0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" Apr 22 19:03:52.545290 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.545118 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-9c99f6755-vvwsf\" (UID: \"0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" Apr 22 19:03:52.545290 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.545144 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853-dshm\") pod \"custom-route-timeout-test-kserve-9c99f6755-vvwsf\" (UID: \"0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" Apr 22 19:03:52.545290 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.545161 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853-model-cache\") pod \"custom-route-timeout-test-kserve-9c99f6755-vvwsf\" (UID: \"0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" Apr 22 19:03:52.545290 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.545190 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dk7vq\" (UniqueName: \"kubernetes.io/projected/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853-kube-api-access-dk7vq\") pod \"custom-route-timeout-test-kserve-9c99f6755-vvwsf\" (UID: \"0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" Apr 22 19:03:52.545522 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.545357 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853-tls-certs\") pod \"custom-route-timeout-test-kserve-9c99f6755-vvwsf\" (UID: \"0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" Apr 22 19:03:52.545522 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.545500 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853-home\") pod \"custom-route-timeout-test-kserve-9c99f6755-vvwsf\" (UID: \"0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" Apr 22 19:03:52.545630 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.545608 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-9c99f6755-vvwsf\" (UID: \"0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" Apr 22 19:03:52.545690 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.545655 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853-model-cache\") pod \"custom-route-timeout-test-kserve-9c99f6755-vvwsf\" (UID: \"0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" Apr 22 19:03:52.547623 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.547597 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853-dshm\") pod \"custom-route-timeout-test-kserve-9c99f6755-vvwsf\" (UID: \"0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" Apr 22 19:03:52.547738 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.547701 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853-tls-certs\") pod \"custom-route-timeout-test-kserve-9c99f6755-vvwsf\" (UID: \"0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" Apr 22 19:03:52.551864 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.551843 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk7vq\" (UniqueName: \"kubernetes.io/projected/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853-kube-api-access-dk7vq\") pod \"custom-route-timeout-test-kserve-9c99f6755-vvwsf\" (UID: \"0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" Apr 22 19:03:52.632212 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.632112 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" Apr 22 19:03:52.765182 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:52.765150 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf"] Apr 22 19:03:52.767547 ip-10-0-136-53 kubenswrapper[2579]: W0422 19:03:52.767481 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b3ee717_82d5_4e7a_b8c0_5a5e86a9e853.slice/crio-1c0ad28bb278ec9a2d879af14d64a207011ba074e895d25cbe1ba6e1871f2ffb WatchSource:0}: Error finding container 1c0ad28bb278ec9a2d879af14d64a207011ba074e895d25cbe1ba6e1871f2ffb: Status 404 returned error can't find the container with id 1c0ad28bb278ec9a2d879af14d64a207011ba074e895d25cbe1ba6e1871f2ffb Apr 22 19:03:53.600792 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:53.600754 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" event={"ID":"0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853","Type":"ContainerStarted","Data":"2ac2b56c64ff7d3790469cc7d1c9280acb869d15b2472e989241cd6ea6a57d0f"} Apr 22 19:03:53.600792 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:53.600795 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" event={"ID":"0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853","Type":"ContainerStarted","Data":"1c0ad28bb278ec9a2d879af14d64a207011ba074e895d25cbe1ba6e1871f2ffb"} Apr 22 19:03:57.614838 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:57.614794 2579 generic.go:358] "Generic (PLEG): container finished" podID="0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853" containerID="2ac2b56c64ff7d3790469cc7d1c9280acb869d15b2472e989241cd6ea6a57d0f" exitCode=0 Apr 22 19:03:57.615295 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:57.614867 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" event={"ID":"0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853","Type":"ContainerDied","Data":"2ac2b56c64ff7d3790469cc7d1c9280acb869d15b2472e989241cd6ea6a57d0f"} Apr 22 19:03:58.619898 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:58.619860 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" event={"ID":"0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853","Type":"ContainerStarted","Data":"85cb3cc18afeb2768a8f9fc7087e85c9733d3ba30dc5cd6a06f9b1ad98cb808c"} Apr 22 19:03:58.638244 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:58.638195 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" podStartSLOduration=6.638179233 podStartE2EDuration="6.638179233s" podCreationTimestamp="2026-04-22 19:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:03:58.637165542 +0000 UTC m=+1025.085948405" watchObservedRunningTime="2026-04-22 19:03:58.638179233 +0000 UTC m=+1025.086961563" Apr 22 19:03:59.549646 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:59.549613 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7"] Apr 22 19:03:59.552958 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:59.552933 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" Apr 22 19:03:59.555338 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:59.555314 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-kvm2d\"" Apr 22 19:03:59.555559 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:59.555478 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 22 19:03:59.563867 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:59.563840 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7"] Apr 22 19:03:59.616760 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:59.616717 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/33b8959e-6c6e-421c-bce9-ad5231cd1b77-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7\" (UID: \"33b8959e-6c6e-421c-bce9-ad5231cd1b77\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" Apr 22 19:03:59.616955 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:59.616779 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/33b8959e-6c6e-421c-bce9-ad5231cd1b77-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7\" (UID: \"33b8959e-6c6e-421c-bce9-ad5231cd1b77\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" Apr 22 19:03:59.616955 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:59.616858 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33b8959e-6c6e-421c-bce9-ad5231cd1b77-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7\" (UID: \"33b8959e-6c6e-421c-bce9-ad5231cd1b77\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" Apr 22 19:03:59.616955 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:59.616912 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/33b8959e-6c6e-421c-bce9-ad5231cd1b77-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7\" (UID: \"33b8959e-6c6e-421c-bce9-ad5231cd1b77\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" Apr 22 19:03:59.617106 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:59.616967 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/33b8959e-6c6e-421c-bce9-ad5231cd1b77-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7\" (UID: \"33b8959e-6c6e-421c-bce9-ad5231cd1b77\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" Apr 22 19:03:59.617106 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:59.617011 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h7kv\" (UniqueName: 
\"kubernetes.io/projected/33b8959e-6c6e-421c-bce9-ad5231cd1b77-kube-api-access-5h7kv\") pod \"stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7\" (UID: \"33b8959e-6c6e-421c-bce9-ad5231cd1b77\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" Apr 22 19:03:59.718143 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:59.718091 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/33b8959e-6c6e-421c-bce9-ad5231cd1b77-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7\" (UID: \"33b8959e-6c6e-421c-bce9-ad5231cd1b77\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" Apr 22 19:03:59.718665 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:59.718162 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/33b8959e-6c6e-421c-bce9-ad5231cd1b77-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7\" (UID: \"33b8959e-6c6e-421c-bce9-ad5231cd1b77\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" Apr 22 19:03:59.718665 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:59.718196 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5h7kv\" (UniqueName: \"kubernetes.io/projected/33b8959e-6c6e-421c-bce9-ad5231cd1b77-kube-api-access-5h7kv\") pod \"stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7\" (UID: \"33b8959e-6c6e-421c-bce9-ad5231cd1b77\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" Apr 22 19:03:59.718665 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:59.718266 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/33b8959e-6c6e-421c-bce9-ad5231cd1b77-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7\" (UID: \"33b8959e-6c6e-421c-bce9-ad5231cd1b77\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" Apr 22 19:03:59.718665 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:59.718301 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/33b8959e-6c6e-421c-bce9-ad5231cd1b77-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7\" (UID: \"33b8959e-6c6e-421c-bce9-ad5231cd1b77\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" Apr 22 19:03:59.718665 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:59.718387 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33b8959e-6c6e-421c-bce9-ad5231cd1b77-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7\" (UID: \"33b8959e-6c6e-421c-bce9-ad5231cd1b77\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" Apr 22 19:03:59.718911 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:59.718692 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/33b8959e-6c6e-421c-bce9-ad5231cd1b77-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7\" (UID: \"33b8959e-6c6e-421c-bce9-ad5231cd1b77\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" Apr 22 19:03:59.718911 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:59.718729 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/33b8959e-6c6e-421c-bce9-ad5231cd1b77-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7\" (UID: \"33b8959e-6c6e-421c-bce9-ad5231cd1b77\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" Apr 22 19:03:59.718911 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:59.718789 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/33b8959e-6c6e-421c-bce9-ad5231cd1b77-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7\" (UID: \"33b8959e-6c6e-421c-bce9-ad5231cd1b77\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" Apr 22 19:03:59.718911 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:59.718831 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33b8959e-6c6e-421c-bce9-ad5231cd1b77-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7\" (UID: \"33b8959e-6c6e-421c-bce9-ad5231cd1b77\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" Apr 22 19:03:59.720803 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:59.720777 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/33b8959e-6c6e-421c-bce9-ad5231cd1b77-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7\" (UID: \"33b8959e-6c6e-421c-bce9-ad5231cd1b77\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" Apr 22 19:03:59.725362 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:59.725334 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h7kv\" (UniqueName: \"kubernetes.io/projected/33b8959e-6c6e-421c-bce9-ad5231cd1b77-kube-api-access-5h7kv\") pod \"stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7\" (UID: \"33b8959e-6c6e-421c-bce9-ad5231cd1b77\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" Apr 22 19:03:59.865146 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:59.865047 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" Apr 22 19:03:59.995643 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:03:59.995610 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7"] Apr 22 19:03:59.998271 ip-10-0-136-53 kubenswrapper[2579]: W0422 19:03:59.998244 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33b8959e_6c6e_421c_bce9_ad5231cd1b77.slice/crio-0513fa65d5d609e257ecbc463627fe05a365491989dce5d90f05fd6e233de5d3 WatchSource:0}: Error finding container 0513fa65d5d609e257ecbc463627fe05a365491989dce5d90f05fd6e233de5d3: Status 404 returned error can't find the container with id 0513fa65d5d609e257ecbc463627fe05a365491989dce5d90f05fd6e233de5d3 Apr 22 19:04:00.628342 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:04:00.628298 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" event={"ID":"33b8959e-6c6e-421c-bce9-ad5231cd1b77","Type":"ContainerStarted","Data":"1927fd601988753b0d4ce3144618c6119ae3c78ec20cc8b1fa7df0f8560a253a"} Apr 22 19:04:00.628342 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:04:00.628341 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" event={"ID":"33b8959e-6c6e-421c-bce9-ad5231cd1b77","Type":"ContainerStarted","Data":"0513fa65d5d609e257ecbc463627fe05a365491989dce5d90f05fd6e233de5d3"} Apr 22 19:04:01.633426 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:04:01.633391 2579 generic.go:358] "Generic (PLEG): container finished" podID="33b8959e-6c6e-421c-bce9-ad5231cd1b77" containerID="1927fd601988753b0d4ce3144618c6119ae3c78ec20cc8b1fa7df0f8560a253a" exitCode=0 Apr 22 19:04:01.633825 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:04:01.633471 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" event={"ID":"33b8959e-6c6e-421c-bce9-ad5231cd1b77","Type":"ContainerDied","Data":"1927fd601988753b0d4ce3144618c6119ae3c78ec20cc8b1fa7df0f8560a253a"} Apr 22 19:04:02.632801 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:04:02.632767 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" Apr 22 19:04:02.632993 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:04:02.632847 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" Apr 22 19:04:02.634501 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:04:02.634451 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" podUID="0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 22 19:04:02.640197 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:04:02.640160 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" event={"ID":"33b8959e-6c6e-421c-bce9-ad5231cd1b77","Type":"ContainerStarted","Data":"c0a2f7e79a9daa55437904da1e54751ed529ebc94783000ada5a458dc397d180"} Apr 22 19:04:02.640383 ip-10-0-136-53 
kubenswrapper[2579]: I0422 19:04:02.640206 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" event={"ID":"33b8959e-6c6e-421c-bce9-ad5231cd1b77","Type":"ContainerStarted","Data":"3c7637d2f2dfb348b47fbc3d1b293291ecdb51078caf54e1d6ad0901b10c6514"} Apr 22 19:04:02.640383 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:04:02.640263 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" Apr 22 19:04:02.658124 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:04:02.658061 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" podStartSLOduration=3.658040607 podStartE2EDuration="3.658040607s" podCreationTimestamp="2026-04-22 19:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:04:02.657274008 +0000 UTC m=+1029.106056352" watchObservedRunningTime="2026-04-22 19:04:02.658040607 +0000 UTC m=+1029.106822938" Apr 22 19:04:09.865388 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:04:09.865345 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" Apr 22 19:04:09.865388 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:04:09.865398 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" Apr 22 19:04:09.868304 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:04:09.868278 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" Apr 22 19:04:10.679442 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:04:10.679404 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" Apr 22 19:04:12.633022 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:04:12.632978 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" podUID="0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 22 19:04:22.633012 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:04:22.632960 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" podUID="0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 22 19:04:31.682901 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:04:31.682868 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" Apr 22 19:04:32.633574 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:04:32.633522 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" podUID="0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853" containerName="main" probeResult="failure" output="Get 
\"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 22 19:04:42.633558 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:04:42.633501 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" podUID="0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 22 19:04:52.633224 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:04:52.633174 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" podUID="0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 22 19:05:02.632738 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:02.632684 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" podUID="0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 22 19:05:12.633733 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:12.633676 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" podUID="0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 22 19:05:22.633217 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:22.633169 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" podUID="0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 22 19:05:32.633207 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:32.633164 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" podUID="0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 22 19:05:41.046388 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:41.046348 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7"] Apr 22 19:05:41.046879 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:41.046826 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" podUID="33b8959e-6c6e-421c-bce9-ad5231cd1b77" containerName="main" containerID="cri-o://3c7637d2f2dfb348b47fbc3d1b293291ecdb51078caf54e1d6ad0901b10c6514" gracePeriod=30 Apr 22 19:05:41.046981 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:41.046895 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" podUID="33b8959e-6c6e-421c-bce9-ad5231cd1b77" containerName="tokenizer" containerID="cri-o://c0a2f7e79a9daa55437904da1e54751ed529ebc94783000ada5a458dc397d180" 
gracePeriod=30 Apr 22 19:05:41.682176 ip-10-0-136-53 kubenswrapper[2579]: W0422 19:05:41.682143 2579 logging.go:55] [core] [Channel #158 SubChannel #159]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.41:9003", ServerName: "10.134.0.41:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.41:9003: connect: connection refused" Apr 22 19:05:42.016441 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:42.016406 2579 generic.go:358] "Generic (PLEG): container finished" podID="33b8959e-6c6e-421c-bce9-ad5231cd1b77" containerID="3c7637d2f2dfb348b47fbc3d1b293291ecdb51078caf54e1d6ad0901b10c6514" exitCode=0 Apr 22 19:05:42.016642 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:42.016477 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" event={"ID":"33b8959e-6c6e-421c-bce9-ad5231cd1b77","Type":"ContainerDied","Data":"3c7637d2f2dfb348b47fbc3d1b293291ecdb51078caf54e1d6ad0901b10c6514"} Apr 22 19:05:42.406038 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:42.406014 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" Apr 22 19:05:42.505668 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:42.505629 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/33b8959e-6c6e-421c-bce9-ad5231cd1b77-tls-certs\") pod \"33b8959e-6c6e-421c-bce9-ad5231cd1b77\" (UID: \"33b8959e-6c6e-421c-bce9-ad5231cd1b77\") " Apr 22 19:05:42.505859 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:42.505684 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/33b8959e-6c6e-421c-bce9-ad5231cd1b77-tokenizer-tmp\") pod \"33b8959e-6c6e-421c-bce9-ad5231cd1b77\" (UID: \"33b8959e-6c6e-421c-bce9-ad5231cd1b77\") " Apr 22 19:05:42.505859 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:42.505727 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/33b8959e-6c6e-421c-bce9-ad5231cd1b77-tokenizer-cache\") pod \"33b8959e-6c6e-421c-bce9-ad5231cd1b77\" (UID: \"33b8959e-6c6e-421c-bce9-ad5231cd1b77\") " Apr 22 19:05:42.505859 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:42.505792 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/33b8959e-6c6e-421c-bce9-ad5231cd1b77-tokenizer-uds\") pod \"33b8959e-6c6e-421c-bce9-ad5231cd1b77\" (UID: \"33b8959e-6c6e-421c-bce9-ad5231cd1b77\") " Apr 22 19:05:42.505859 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:42.505820 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5h7kv\" (UniqueName: \"kubernetes.io/projected/33b8959e-6c6e-421c-bce9-ad5231cd1b77-kube-api-access-5h7kv\") pod \"33b8959e-6c6e-421c-bce9-ad5231cd1b77\" (UID: \"33b8959e-6c6e-421c-bce9-ad5231cd1b77\") " Apr 22 19:05:42.506078 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:42.505872 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33b8959e-6c6e-421c-bce9-ad5231cd1b77-kserve-provision-location\") pod \"33b8959e-6c6e-421c-bce9-ad5231cd1b77\" (UID: \"33b8959e-6c6e-421c-bce9-ad5231cd1b77\") " Apr 22 19:05:42.506078 
ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:42.506062 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33b8959e-6c6e-421c-bce9-ad5231cd1b77-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "33b8959e-6c6e-421c-bce9-ad5231cd1b77" (UID: "33b8959e-6c6e-421c-bce9-ad5231cd1b77"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:05:42.506176 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:42.506087 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33b8959e-6c6e-421c-bce9-ad5231cd1b77-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "33b8959e-6c6e-421c-bce9-ad5231cd1b77" (UID: "33b8959e-6c6e-421c-bce9-ad5231cd1b77"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:05:42.506176 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:42.506099 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33b8959e-6c6e-421c-bce9-ad5231cd1b77-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "33b8959e-6c6e-421c-bce9-ad5231cd1b77" (UID: "33b8959e-6c6e-421c-bce9-ad5231cd1b77"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:05:42.506334 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:42.506307 2579 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/33b8959e-6c6e-421c-bce9-ad5231cd1b77-tokenizer-uds\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:05:42.506334 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:42.506333 2579 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/33b8959e-6c6e-421c-bce9-ad5231cd1b77-tokenizer-tmp\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:05:42.506334 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:42.506343 2579 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/33b8959e-6c6e-421c-bce9-ad5231cd1b77-tokenizer-cache\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:05:42.506734 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:42.506710 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33b8959e-6c6e-421c-bce9-ad5231cd1b77-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "33b8959e-6c6e-421c-bce9-ad5231cd1b77" (UID: "33b8959e-6c6e-421c-bce9-ad5231cd1b77"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:05:42.507943 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:42.507923 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33b8959e-6c6e-421c-bce9-ad5231cd1b77-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "33b8959e-6c6e-421c-bce9-ad5231cd1b77" (UID: "33b8959e-6c6e-421c-bce9-ad5231cd1b77"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:05:42.507943 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:42.507924 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33b8959e-6c6e-421c-bce9-ad5231cd1b77-kube-api-access-5h7kv" (OuterVolumeSpecName: "kube-api-access-5h7kv") pod "33b8959e-6c6e-421c-bce9-ad5231cd1b77" (UID: "33b8959e-6c6e-421c-bce9-ad5231cd1b77"). InnerVolumeSpecName "kube-api-access-5h7kv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:05:42.607732 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:42.607629 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/33b8959e-6c6e-421c-bce9-ad5231cd1b77-tls-certs\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:05:42.607732 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:42.607666 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5h7kv\" (UniqueName: \"kubernetes.io/projected/33b8959e-6c6e-421c-bce9-ad5231cd1b77-kube-api-access-5h7kv\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:05:42.607732 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:42.607676 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33b8959e-6c6e-421c-bce9-ad5231cd1b77-kserve-provision-location\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:05:42.642469 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:42.642434 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" Apr 22 19:05:42.650307 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:42.650277 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" Apr 22 19:05:42.682545 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:42.682504 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" podUID="33b8959e-6c6e-421c-bce9-ad5231cd1b77" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.41:9003\" within 1s: context deadline exceeded" Apr 22 19:05:42.682729 ip-10-0-136-53 kubenswrapper[2579]: W0422 19:05:42.682602 2579 logging.go:55] [core] [Channel #158 SubChannel #159]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.41:9003", ServerName: "10.134.0.41:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.41:9003: operation was canceled" Apr 22 19:05:43.022329 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:43.022293 2579 generic.go:358] "Generic (PLEG): container finished" podID="33b8959e-6c6e-421c-bce9-ad5231cd1b77" containerID="c0a2f7e79a9daa55437904da1e54751ed529ebc94783000ada5a458dc397d180" exitCode=0 Apr 22 19:05:43.022564 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:43.022381 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" event={"ID":"33b8959e-6c6e-421c-bce9-ad5231cd1b77","Type":"ContainerDied","Data":"c0a2f7e79a9daa55437904da1e54751ed529ebc94783000ada5a458dc397d180"} Apr 22 19:05:43.022564 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:43.022401 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" Apr 22 19:05:43.022564 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:43.022428 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7" event={"ID":"33b8959e-6c6e-421c-bce9-ad5231cd1b77","Type":"ContainerDied","Data":"0513fa65d5d609e257ecbc463627fe05a365491989dce5d90f05fd6e233de5d3"} Apr 22 19:05:43.022564 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:43.022450 2579 scope.go:117] "RemoveContainer" containerID="c0a2f7e79a9daa55437904da1e54751ed529ebc94783000ada5a458dc397d180" Apr 22 19:05:43.039192 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:43.039169 2579 scope.go:117] "RemoveContainer" containerID="3c7637d2f2dfb348b47fbc3d1b293291ecdb51078caf54e1d6ad0901b10c6514" Apr 22 19:05:43.047208 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:43.047182 2579 scope.go:117] "RemoveContainer" containerID="1927fd601988753b0d4ce3144618c6119ae3c78ec20cc8b1fa7df0f8560a253a" Apr 22 19:05:43.050110 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:43.050084 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7"] Apr 22 19:05:43.056227 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:43.056203 2579 scope.go:117] "RemoveContainer" containerID="c0a2f7e79a9daa55437904da1e54751ed529ebc94783000ada5a458dc397d180" Apr 22 19:05:43.056500 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:43.056463 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-555cbff9b9-mr9t7"] Apr 22 19:05:43.056595 ip-10-0-136-53 kubenswrapper[2579]: E0422 19:05:43.056561 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0a2f7e79a9daa55437904da1e54751ed529ebc94783000ada5a458dc397d180\": container with ID starting with c0a2f7e79a9daa55437904da1e54751ed529ebc94783000ada5a458dc397d180 not found: ID does not exist" containerID="c0a2f7e79a9daa55437904da1e54751ed529ebc94783000ada5a458dc397d180" Apr 22 19:05:43.056643 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:43.056590 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0a2f7e79a9daa55437904da1e54751ed529ebc94783000ada5a458dc397d180"} err="failed to get container status \"c0a2f7e79a9daa55437904da1e54751ed529ebc94783000ada5a458dc397d180\": rpc error: code = NotFound desc = could not find container \"c0a2f7e79a9daa55437904da1e54751ed529ebc94783000ada5a458dc397d180\": container with ID starting with c0a2f7e79a9daa55437904da1e54751ed529ebc94783000ada5a458dc397d180 not found: ID does not exist" Apr 22 19:05:43.056643 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:43.056611 2579 scope.go:117] "RemoveContainer" containerID="3c7637d2f2dfb348b47fbc3d1b293291ecdb51078caf54e1d6ad0901b10c6514" Apr 22 19:05:43.056891 ip-10-0-136-53 kubenswrapper[2579]: E0422 19:05:43.056874 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c7637d2f2dfb348b47fbc3d1b293291ecdb51078caf54e1d6ad0901b10c6514\": container with ID starting with 3c7637d2f2dfb348b47fbc3d1b293291ecdb51078caf54e1d6ad0901b10c6514 not found: ID does not exist" containerID="3c7637d2f2dfb348b47fbc3d1b293291ecdb51078caf54e1d6ad0901b10c6514" Apr 22 19:05:43.056933 ip-10-0-136-53 kubenswrapper[2579]: I0422 
19:05:43.056897 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c7637d2f2dfb348b47fbc3d1b293291ecdb51078caf54e1d6ad0901b10c6514"} err="failed to get container status \"3c7637d2f2dfb348b47fbc3d1b293291ecdb51078caf54e1d6ad0901b10c6514\": rpc error: code = NotFound desc = could not find container \"3c7637d2f2dfb348b47fbc3d1b293291ecdb51078caf54e1d6ad0901b10c6514\": container with ID starting with 3c7637d2f2dfb348b47fbc3d1b293291ecdb51078caf54e1d6ad0901b10c6514 not found: ID does not exist" Apr 22 19:05:43.056933 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:43.056916 2579 scope.go:117] "RemoveContainer" containerID="1927fd601988753b0d4ce3144618c6119ae3c78ec20cc8b1fa7df0f8560a253a" Apr 22 19:05:43.057164 ip-10-0-136-53 kubenswrapper[2579]: E0422 19:05:43.057145 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1927fd601988753b0d4ce3144618c6119ae3c78ec20cc8b1fa7df0f8560a253a\": container with ID starting with 1927fd601988753b0d4ce3144618c6119ae3c78ec20cc8b1fa7df0f8560a253a not found: ID does not exist" containerID="1927fd601988753b0d4ce3144618c6119ae3c78ec20cc8b1fa7df0f8560a253a" Apr 22 19:05:43.057236 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:43.057167 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1927fd601988753b0d4ce3144618c6119ae3c78ec20cc8b1fa7df0f8560a253a"} err="failed to get container status \"1927fd601988753b0d4ce3144618c6119ae3c78ec20cc8b1fa7df0f8560a253a\": rpc error: code = NotFound desc = could not find container \"1927fd601988753b0d4ce3144618c6119ae3c78ec20cc8b1fa7df0f8560a253a\": container with ID starting with 1927fd601988753b0d4ce3144618c6119ae3c78ec20cc8b1fa7df0f8560a253a not found: ID does not exist" Apr 22 19:05:43.291592 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:43.291480 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-798bc4876f-2pm79"] Apr 22 19:05:43.291961 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:43.291941 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33b8959e-6c6e-421c-bce9-ad5231cd1b77" containerName="main" Apr 22 19:05:43.292050 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:43.291964 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="33b8959e-6c6e-421c-bce9-ad5231cd1b77" containerName="main" Apr 22 19:05:43.292050 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:43.291977 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33b8959e-6c6e-421c-bce9-ad5231cd1b77" containerName="tokenizer" Apr 22 19:05:43.292050 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:43.291984 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="33b8959e-6c6e-421c-bce9-ad5231cd1b77" containerName="tokenizer" Apr 22 19:05:43.292050 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:43.292004 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33b8959e-6c6e-421c-bce9-ad5231cd1b77" containerName="storage-initializer" Apr 22 19:05:43.292050 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:43.292014 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="33b8959e-6c6e-421c-bce9-ad5231cd1b77" containerName="storage-initializer" Apr 22 19:05:43.292293 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:43.292122 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="33b8959e-6c6e-421c-bce9-ad5231cd1b77" containerName="main" 
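
The ContainerStatus/DeleteContainer NotFound errors above are a benign shutdown race, not a lost container: the stop-feature-test pod received its SyncLoop DELETE at 19:05:41 with gracePeriod=30, both containers exited 0 within a couple of seconds, and by the time the API REMOVE landed at 19:05:43 CRI-O had already pruned them, so the kubelet's follow-up RemoveContainer status queries came back rpc code = NotFound. A minimal sketch of how one might confirm that from an excerpt like this one (an illustrative Python script, not part of any tooling referenced in this log):

#!/usr/bin/env python3
# Illustrative only: pair each kubelet "RemoveContainer" request in a
# journald excerpt (read on stdin) with any "DeleteContainer returned
# error ... NotFound" result, to separate benign cleanup races from
# genuinely unexpected failures.
import re
import sys

text = sys.stdin.read()

# scope.go logs the removal request; pod_container_deletor.go logs the
# NotFound outcome when CRI-O has already deleted the container.
requested = set(re.findall(
    r'"RemoveContainer" containerID="([0-9a-f]{64})"', text))
not_found = set(re.findall(
    r'"DeleteContainer returned error" '
    r'containerID=\{"Type":"cri-o","ID":"([0-9a-f]{64})"\}', text))

for cid in sorted(not_found):
    verdict = ("benign race: removal was already requested"
               if cid in requested
               else "unexpected: NotFound without a RemoveContainer request")
    print(cid[:13], verdict)

Fed this excerpt, it should report all three container IDs (beginning c0a2f7, 3c7637, and 1927fd) as benign races.
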
Apr 22 19:05:43.292293 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:43.292139 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="33b8959e-6c6e-421c-bce9-ad5231cd1b77" containerName="tokenizer" Apr 22 19:05:43.297034 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:43.297009 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-798bc4876f-2pm79" Apr 22 19:05:43.299261 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:43.299234 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 22 19:05:43.299510 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:43.299305 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-nlb48\"" Apr 22 19:05:43.302187 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:43.302162 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-798bc4876f-2pm79"] Apr 22 19:05:43.414460 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:43.414409 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcr5z\" (UniqueName: \"kubernetes.io/projected/75b1e015-8755-43bc-85cf-1cf20445c262-kube-api-access-jcr5z\") pod \"llmisvc-controller-manager-798bc4876f-2pm79\" (UID: \"75b1e015-8755-43bc-85cf-1cf20445c262\") " pod="kserve/llmisvc-controller-manager-798bc4876f-2pm79" Apr 22 19:05:43.414874 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:43.414568 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75b1e015-8755-43bc-85cf-1cf20445c262-cert\") pod \"llmisvc-controller-manager-798bc4876f-2pm79\" (UID: \"75b1e015-8755-43bc-85cf-1cf20445c262\") " pod="kserve/llmisvc-controller-manager-798bc4876f-2pm79" Apr 22 19:05:43.515948 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:43.515905 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jcr5z\" (UniqueName: \"kubernetes.io/projected/75b1e015-8755-43bc-85cf-1cf20445c262-kube-api-access-jcr5z\") pod \"llmisvc-controller-manager-798bc4876f-2pm79\" (UID: \"75b1e015-8755-43bc-85cf-1cf20445c262\") " pod="kserve/llmisvc-controller-manager-798bc4876f-2pm79" Apr 22 19:05:43.516162 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:43.516003 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75b1e015-8755-43bc-85cf-1cf20445c262-cert\") pod \"llmisvc-controller-manager-798bc4876f-2pm79\" (UID: \"75b1e015-8755-43bc-85cf-1cf20445c262\") " pod="kserve/llmisvc-controller-manager-798bc4876f-2pm79" Apr 22 19:05:43.518336 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:43.518312 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75b1e015-8755-43bc-85cf-1cf20445c262-cert\") pod \"llmisvc-controller-manager-798bc4876f-2pm79\" (UID: \"75b1e015-8755-43bc-85cf-1cf20445c262\") " pod="kserve/llmisvc-controller-manager-798bc4876f-2pm79" Apr 22 19:05:43.522903 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:43.522872 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcr5z\" (UniqueName: \"kubernetes.io/projected/75b1e015-8755-43bc-85cf-1cf20445c262-kube-api-access-jcr5z\") pod \"llmisvc-controller-manager-798bc4876f-2pm79\" (UID: 
\"75b1e015-8755-43bc-85cf-1cf20445c262\") " pod="kserve/llmisvc-controller-manager-798bc4876f-2pm79" Apr 22 19:05:43.608701 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:43.608611 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-798bc4876f-2pm79" Apr 22 19:05:43.750728 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:43.750160 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-798bc4876f-2pm79"] Apr 22 19:05:43.754340 ip-10-0-136-53 kubenswrapper[2579]: W0422 19:05:43.754307 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod75b1e015_8755_43bc_85cf_1cf20445c262.slice/crio-3207c7437076036b3f0ab7d41cc22ee49e5557aee5e8c858bf340fcdb47ee3c2 WatchSource:0}: Error finding container 3207c7437076036b3f0ab7d41cc22ee49e5557aee5e8c858bf340fcdb47ee3c2: Status 404 returned error can't find the container with id 3207c7437076036b3f0ab7d41cc22ee49e5557aee5e8c858bf340fcdb47ee3c2 Apr 22 19:05:43.756106 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:43.756082 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:05:44.027716 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:44.027676 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-798bc4876f-2pm79" event={"ID":"75b1e015-8755-43bc-85cf-1cf20445c262","Type":"ContainerStarted","Data":"3207c7437076036b3f0ab7d41cc22ee49e5557aee5e8c858bf340fcdb47ee3c2"} Apr 22 19:05:44.164403 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:44.164369 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33b8959e-6c6e-421c-bce9-ad5231cd1b77" path="/var/lib/kubelet/pods/33b8959e-6c6e-421c-bce9-ad5231cd1b77/volumes" Apr 22 19:05:48.997925 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:48.997887 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf"] Apr 22 19:05:48.998332 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:48.998256 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" podUID="0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853" containerName="main" containerID="cri-o://85cb3cc18afeb2768a8f9fc7087e85c9733d3ba30dc5cd6a06f9b1ad98cb808c" gracePeriod=30 Apr 22 19:05:49.052593 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:49.052557 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-798bc4876f-2pm79" event={"ID":"75b1e015-8755-43bc-85cf-1cf20445c262","Type":"ContainerStarted","Data":"e7262e827dd62e703fc5451ebce7750a87c7a82f784187ae7ba16c22d0942e00"} Apr 22 19:05:49.052764 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:49.052675 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-798bc4876f-2pm79" Apr 22 19:05:49.068767 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:05:49.068719 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-798bc4876f-2pm79" podStartSLOduration=1.7508835230000002 podStartE2EDuration="6.068704945s" podCreationTimestamp="2026-04-22 19:05:43 +0000 UTC" firstStartedPulling="2026-04-22 19:05:43.756267959 +0000 UTC m=+1130.205050274" lastFinishedPulling="2026-04-22 19:05:48.074089387 +0000 UTC m=+1134.522871696" observedRunningTime="2026-04-22 19:05:49.066805449 +0000 UTC 
m=+1135.515587781" watchObservedRunningTime="2026-04-22 19:05:49.068704945 +0000 UTC m=+1135.517487275" Apr 22 19:06:19.160688 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:19.160641 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-9c99f6755-vvwsf_0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853/main/0.log" Apr 22 19:06:19.161175 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:19.161112 2579 generic.go:358] "Generic (PLEG): container finished" podID="0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853" containerID="85cb3cc18afeb2768a8f9fc7087e85c9733d3ba30dc5cd6a06f9b1ad98cb808c" exitCode=137 Apr 22 19:06:19.161293 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:19.161267 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" event={"ID":"0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853","Type":"ContainerDied","Data":"85cb3cc18afeb2768a8f9fc7087e85c9733d3ba30dc5cd6a06f9b1ad98cb808c"} Apr 22 19:06:19.223541 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:19.223517 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-9c99f6755-vvwsf_0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853/main/0.log" Apr 22 19:06:19.223903 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:19.223885 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" Apr 22 19:06:19.336646 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:19.336554 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853-home\") pod \"0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853\" (UID: \"0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853\") " Apr 22 19:06:19.336646 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:19.336621 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853-kserve-provision-location\") pod \"0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853\" (UID: \"0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853\") " Apr 22 19:06:19.336892 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:19.336655 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dk7vq\" (UniqueName: \"kubernetes.io/projected/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853-kube-api-access-dk7vq\") pod \"0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853\" (UID: \"0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853\") " Apr 22 19:06:19.336892 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:19.336678 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853-model-cache\") pod \"0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853\" (UID: \"0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853\") " Apr 22 19:06:19.336892 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:19.336741 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853-dshm\") pod \"0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853\" (UID: \"0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853\") " Apr 22 19:06:19.336892 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:19.336794 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853-tls-certs\") pod \"0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853\" (UID: \"0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853\") " Apr 22 19:06:19.337113 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:19.337023 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853-model-cache" (OuterVolumeSpecName: "model-cache") pod "0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853" (UID: "0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:06:19.337113 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:19.337070 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853-home" (OuterVolumeSpecName: "home") pod "0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853" (UID: "0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:06:19.338906 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:19.338877 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853-kube-api-access-dk7vq" (OuterVolumeSpecName: "kube-api-access-dk7vq") pod "0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853" (UID: "0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853"). InnerVolumeSpecName "kube-api-access-dk7vq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:06:19.338906 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:19.338888 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853-dshm" (OuterVolumeSpecName: "dshm") pod "0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853" (UID: "0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:06:19.338906 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:19.338893 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853" (UID: "0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:06:19.404016 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:19.403946 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853" (UID: "0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:06:19.437816 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:19.437782 2579 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853-dshm\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:06:19.437816 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:19.437812 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853-tls-certs\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:06:19.437816 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:19.437822 2579 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853-home\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:06:19.437816 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:19.437832 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853-kserve-provision-location\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:06:19.437816 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:19.437842 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dk7vq\" (UniqueName: \"kubernetes.io/projected/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853-kube-api-access-dk7vq\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:06:19.438250 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:19.437852 2579 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853-model-cache\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:06:20.059625 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:20.059595 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-798bc4876f-2pm79" Apr 22 19:06:20.166401 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:20.166369 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-9c99f6755-vvwsf_0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853/main/0.log" Apr 22 19:06:20.166945 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:20.166823 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" event={"ID":"0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853","Type":"ContainerDied","Data":"1c0ad28bb278ec9a2d879af14d64a207011ba074e895d25cbe1ba6e1871f2ffb"} Apr 22 19:06:20.166945 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:20.166846 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf" Apr 22 19:06:20.166945 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:20.166874 2579 scope.go:117] "RemoveContainer" containerID="85cb3cc18afeb2768a8f9fc7087e85c9733d3ba30dc5cd6a06f9b1ad98cb808c" Apr 22 19:06:20.192772 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:20.192738 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf"] Apr 22 19:06:20.193382 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:20.193358 2579 scope.go:117] "RemoveContainer" containerID="2ac2b56c64ff7d3790469cc7d1c9280acb869d15b2472e989241cd6ea6a57d0f" Apr 22 19:06:20.196023 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:20.196001 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-9c99f6755-vvwsf"] Apr 22 19:06:22.165039 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:22.165004 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853" path="/var/lib/kubelet/pods/0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853/volumes" Apr 22 19:06:50.178430 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:50.178357 2579 pod_container_manager_linux.go:217] "Failed to delete cgroup paths" cgroupName=["kubepods","burstable","pod0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853"] err="unable to destroy cgroup paths for cgroup [kubepods burstable pod0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853] : Timed out while waiting for systemd to remove kubepods-burstable-pod0b3ee717_82d5_4e7a_b8c0_5a5e86a9e853.slice" Apr 22 19:06:54.127019 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:54.126985 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tcrqv_e741af54-1487-45cb-b5ea-22577783221e/ovn-acl-logging/0.log" Apr 22 19:06:54.128470 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:54.128449 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tcrqv_e741af54-1487-45cb-b5ea-22577783221e/ovn-acl-logging/0.log" Apr 22 19:06:58.439881 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.439839 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx"] Apr 22 19:06:58.440450 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.440375 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853" containerName="main" Apr 22 19:06:58.440450 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.440391 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853" containerName="main" Apr 22 19:06:58.440450 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.440404 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853" containerName="storage-initializer" Apr 22 19:06:58.440450 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.440412 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853" containerName="storage-initializer" Apr 22 19:06:58.440709 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.440471 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b3ee717-82d5-4e7a-b8c0-5a5e86a9e853" containerName="main" Apr 22 19:06:58.444236 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.444210 2579 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx" Apr 22 19:06:58.446864 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.446841 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-2-openshift-default-dockercfg-kswr2\"" Apr 22 19:06:58.456449 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.456417 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx"] Apr 22 19:06:58.490867 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.490837 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/76a20a8e-8dd7-4dae-9619-2506028350be-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-9c8xx\" (UID: \"76a20a8e-8dd7-4dae-9619-2506028350be\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx" Apr 22 19:06:58.491026 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.490883 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/76a20a8e-8dd7-4dae-9619-2506028350be-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-9c8xx\" (UID: \"76a20a8e-8dd7-4dae-9619-2506028350be\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx" Apr 22 19:06:58.491026 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.490903 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/76a20a8e-8dd7-4dae-9619-2506028350be-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-9c8xx\" (UID: \"76a20a8e-8dd7-4dae-9619-2506028350be\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx" Apr 22 19:06:58.491026 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.490926 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/76a20a8e-8dd7-4dae-9619-2506028350be-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-9c8xx\" (UID: \"76a20a8e-8dd7-4dae-9619-2506028350be\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx" Apr 22 19:06:58.491026 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.490950 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/76a20a8e-8dd7-4dae-9619-2506028350be-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-9c8xx\" (UID: \"76a20a8e-8dd7-4dae-9619-2506028350be\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx" Apr 22 19:06:58.491026 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.490974 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8tvj\" (UniqueName: \"kubernetes.io/projected/76a20a8e-8dd7-4dae-9619-2506028350be-kube-api-access-v8tvj\") pod \"router-gateway-2-openshift-default-6866b85949-9c8xx\" (UID: \"76a20a8e-8dd7-4dae-9619-2506028350be\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx" Apr 22 19:06:58.491206 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.491066 2579 
Apr 22 19:06:58.491206 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.491113 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/76a20a8e-8dd7-4dae-9619-2506028350be-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-9c8xx\" (UID: \"76a20a8e-8dd7-4dae-9619-2506028350be\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx"
Apr 22 19:06:58.491206 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.491162 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/76a20a8e-8dd7-4dae-9619-2506028350be-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-9c8xx\" (UID: \"76a20a8e-8dd7-4dae-9619-2506028350be\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx"
Apr 22 19:06:58.592496 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.592443 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/76a20a8e-8dd7-4dae-9619-2506028350be-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-9c8xx\" (UID: \"76a20a8e-8dd7-4dae-9619-2506028350be\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx"
Apr 22 19:06:58.592698 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.592614 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/76a20a8e-8dd7-4dae-9619-2506028350be-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-9c8xx\" (UID: \"76a20a8e-8dd7-4dae-9619-2506028350be\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx"
Apr 22 19:06:58.592698 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.592681 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/76a20a8e-8dd7-4dae-9619-2506028350be-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-9c8xx\" (UID: \"76a20a8e-8dd7-4dae-9619-2506028350be\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx"
Apr 22 19:06:58.592795 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.592709 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/76a20a8e-8dd7-4dae-9619-2506028350be-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-9c8xx\" (UID: \"76a20a8e-8dd7-4dae-9619-2506028350be\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx"
Apr 22 19:06:58.592795 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.592742 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/76a20a8e-8dd7-4dae-9619-2506028350be-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-9c8xx\" (UID: \"76a20a8e-8dd7-4dae-9619-2506028350be\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx"
Apr 22 19:06:58.592795 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.592784 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/76a20a8e-8dd7-4dae-9619-2506028350be-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-9c8xx\" (UID: \"76a20a8e-8dd7-4dae-9619-2506028350be\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx"
Apr 22 19:06:58.592997 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.592824 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8tvj\" (UniqueName: \"kubernetes.io/projected/76a20a8e-8dd7-4dae-9619-2506028350be-kube-api-access-v8tvj\") pod \"router-gateway-2-openshift-default-6866b85949-9c8xx\" (UID: \"76a20a8e-8dd7-4dae-9619-2506028350be\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx"
Apr 22 19:06:58.592997 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.592877 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/76a20a8e-8dd7-4dae-9619-2506028350be-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-9c8xx\" (UID: \"76a20a8e-8dd7-4dae-9619-2506028350be\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx"
Apr 22 19:06:58.592997 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.592931 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/76a20a8e-8dd7-4dae-9619-2506028350be-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-9c8xx\" (UID: \"76a20a8e-8dd7-4dae-9619-2506028350be\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx"
Apr 22 19:06:58.593144 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.592998 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/76a20a8e-8dd7-4dae-9619-2506028350be-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-9c8xx\" (UID: \"76a20a8e-8dd7-4dae-9619-2506028350be\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx"
Apr 22 19:06:58.593144 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.593028 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/76a20a8e-8dd7-4dae-9619-2506028350be-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-9c8xx\" (UID: \"76a20a8e-8dd7-4dae-9619-2506028350be\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx"
Apr 22 19:06:58.593144 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.593079 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/76a20a8e-8dd7-4dae-9619-2506028350be-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-9c8xx\" (UID: \"76a20a8e-8dd7-4dae-9619-2506028350be\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx"
Apr 22 19:06:58.593298 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.593258 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/76a20a8e-8dd7-4dae-9619-2506028350be-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-9c8xx\" (UID: \"76a20a8e-8dd7-4dae-9619-2506028350be\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx"
Apr 22 19:06:58.593721 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.593695 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/76a20a8e-8dd7-4dae-9619-2506028350be-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-9c8xx\" (UID: \"76a20a8e-8dd7-4dae-9619-2506028350be\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx"
Apr 22 19:06:58.595143 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.595117 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/76a20a8e-8dd7-4dae-9619-2506028350be-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-9c8xx\" (UID: \"76a20a8e-8dd7-4dae-9619-2506028350be\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx"
Apr 22 19:06:58.595326 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.595308 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/76a20a8e-8dd7-4dae-9619-2506028350be-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-9c8xx\" (UID: \"76a20a8e-8dd7-4dae-9619-2506028350be\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx"
Apr 22 19:06:58.600033 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.600010 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/76a20a8e-8dd7-4dae-9619-2506028350be-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-9c8xx\" (UID: \"76a20a8e-8dd7-4dae-9619-2506028350be\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx"
Apr 22 19:06:58.600172 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.600152 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8tvj\" (UniqueName: \"kubernetes.io/projected/76a20a8e-8dd7-4dae-9619-2506028350be-kube-api-access-v8tvj\") pod \"router-gateway-2-openshift-default-6866b85949-9c8xx\" (UID: \"76a20a8e-8dd7-4dae-9619-2506028350be\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx"
Apr 22 19:06:58.757758 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.757722 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx"
Apr 22 19:06:58.921988 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.921952 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx"]
Apr 22 19:06:58.926623 ip-10-0-136-53 kubenswrapper[2579]: W0422 19:06:58.926583 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76a20a8e_8dd7_4dae_9619_2506028350be.slice/crio-58f123bc721f931314ca2d125c4534900a7e9281e6213fa8242a8215dedadbbc WatchSource:0}: Error finding container 58f123bc721f931314ca2d125c4534900a7e9281e6213fa8242a8215dedadbbc: Status 404 returned error can't find the container with id 58f123bc721f931314ca2d125c4534900a7e9281e6213fa8242a8215dedadbbc
Apr 22 19:06:58.929824 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.929787 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 22 19:06:58.929937 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.929875 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 22 19:06:58.929937 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:58.929918 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 22 19:06:59.300981 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:59.300936 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx" event={"ID":"76a20a8e-8dd7-4dae-9619-2506028350be","Type":"ContainerStarted","Data":"9ea29f68321dcc8e7c729a7a53c608016309d3c237a66e8a9436bea7953d7f3b"}
Apr 22 19:06:59.301178 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:59.301000 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx" event={"ID":"76a20a8e-8dd7-4dae-9619-2506028350be","Type":"ContainerStarted","Data":"58f123bc721f931314ca2d125c4534900a7e9281e6213fa8242a8215dedadbbc"}
Apr 22 19:06:59.319014 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:59.318677 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx" podStartSLOduration=1.318657896 podStartE2EDuration="1.318657896s" podCreationTimestamp="2026-04-22 19:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:06:59.317994477 +0000 UTC m=+1205.766776807" watchObservedRunningTime="2026-04-22 19:06:59.318657896 +0000 UTC m=+1205.767440227"
Apr 22 19:06:59.758119 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:59.758080 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx"
Apr 22 19:06:59.763551 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:06:59.763522 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx"
Apr 22 19:07:00.304515 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:00.304455 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx"
Apr 22 19:07:00.305352 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:00.305329 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9c8xx"
Apr 22 19:07:30.996693 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:30.996658 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw"]
Apr 22 19:07:30.999387 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:30.999368 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw"
Apr 22 19:07:31.001660 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:31.001628 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4mhhb\""
Apr 22 19:07:31.001770 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:31.001658 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\""
Apr 22 19:07:31.001770 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:31.001718 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-dockercfg-s9sdj\""
Apr 22 19:07:31.009056 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:31.009032 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw"]
Apr 22 19:07:31.108794 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:31.108762 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nm4f\" (UniqueName: \"kubernetes.io/projected/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f-kube-api-access-4nm4f\") pod \"router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw\" (UID: \"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw"
Apr 22 19:07:31.108794 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:31.108799 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f-model-cache\") pod \"router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw\" (UID: \"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw"
Apr 22 19:07:31.109015 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:31.108837 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f-home\") pod \"router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw\" (UID: \"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw"
Apr 22 19:07:31.109015 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:31.108931 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f-dshm\") pod \"router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw\" (UID: \"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw"
Apr 22 19:07:31.109015 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:31.108991 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f-tls-certs\") pod \"router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw\" (UID: \"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw"
Apr 22 19:07:31.109139 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:31.109015 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw\" (UID: \"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw"
Apr 22 19:07:31.209690 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:31.209649 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f-home\") pod \"router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw\" (UID: \"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw"
Apr 22 19:07:31.209887 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:31.209731 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f-dshm\") pod \"router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw\" (UID: \"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw"
Apr 22 19:07:31.209887 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:31.209787 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f-tls-certs\") pod \"router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw\" (UID: \"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw"
Apr 22 19:07:31.209887 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:31.209806 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw\" (UID: \"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw"
Apr 22 19:07:31.209887 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:31.209839 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4nm4f\" (UniqueName: \"kubernetes.io/projected/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f-kube-api-access-4nm4f\") pod \"router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw\" (UID: \"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw"
Apr 22 19:07:31.209887 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:31.209862 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f-model-cache\") pod \"router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw\" (UID: \"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw"
Apr 22 19:07:31.210132 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:31.210074 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f-home\") pod \"router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw\" (UID: \"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw"
Apr 22 19:07:31.210229 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:31.210200 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw\" (UID: \"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw"
Apr 22 19:07:31.210310 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:31.210255 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f-model-cache\") pod \"router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw\" (UID: \"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw"
Apr 22 19:07:31.212079 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:31.212048 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f-dshm\") pod \"router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw\" (UID: \"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw"
Apr 22 19:07:31.212438 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:31.212415 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f-tls-certs\") pod \"router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw\" (UID: \"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw"
Apr 22 19:07:31.217650 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:31.217619 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nm4f\" (UniqueName: \"kubernetes.io/projected/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f-kube-api-access-4nm4f\") pod \"router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw\" (UID: \"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw"
Apr 22 19:07:31.310785 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:31.310684 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw"
Apr 22 19:07:31.645925 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:31.645890 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw"]
Apr 22 19:07:31.647204 ip-10-0-136-53 kubenswrapper[2579]: W0422 19:07:31.647163 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e6afa4d_1fb6_456f_8635_3016bf0a6f8f.slice/crio-06b0953f8e81b87d5bcb1ea8bb032471dae7ceed26424d9ea637500be2716665 WatchSource:0}: Error finding container 06b0953f8e81b87d5bcb1ea8bb032471dae7ceed26424d9ea637500be2716665: Status 404 returned error can't find the container with id 06b0953f8e81b87d5bcb1ea8bb032471dae7ceed26424d9ea637500be2716665
Apr 22 19:07:32.412052 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:32.412020 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw" event={"ID":"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f","Type":"ContainerStarted","Data":"06b0953f8e81b87d5bcb1ea8bb032471dae7ceed26424d9ea637500be2716665"}
Apr 22 19:07:33.417504 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:33.417455 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw" event={"ID":"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f","Type":"ContainerStarted","Data":"251caf874c4830773c532880eb8e21c8eb786399aeaa53c7b1c1c93ff4b60ea7"}
Apr 22 19:07:33.417900 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:33.417622 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw"
Apr 22 19:07:34.424224 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:34.424179 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw" event={"ID":"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f","Type":"ContainerStarted","Data":"df94f190d1897547c2edabcc35f19223b24b8c21cafa85c48123371e333ed01f"}
Apr 22 19:07:38.440647 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:38.440610 2579 generic.go:358] "Generic (PLEG): container finished" podID="5e6afa4d-1fb6-456f-8635-3016bf0a6f8f" containerID="df94f190d1897547c2edabcc35f19223b24b8c21cafa85c48123371e333ed01f" exitCode=0
Apr 22 19:07:38.441034 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:38.440682 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw" event={"ID":"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f","Type":"ContainerDied","Data":"df94f190d1897547c2edabcc35f19223b24b8c21cafa85c48123371e333ed01f"}
Apr 22 19:07:39.447315 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:39.447270 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw" event={"ID":"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f","Type":"ContainerStarted","Data":"703eea89bf2ef326e809dc34afaabf36ccb41fc52c7b79848f89ac6b7d6c7c1a"}
Apr 22 19:07:39.466699 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:39.466643 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw" podStartSLOduration=8.717494337 podStartE2EDuration="9.466607926s" podCreationTimestamp="2026-04-22 19:07:30 +0000 UTC" firstStartedPulling="2026-04-22 19:07:31.649163051 +0000 UTC m=+1238.097945359" lastFinishedPulling="2026-04-22 19:07:32.398276629 +0000 UTC m=+1238.847058948" observedRunningTime="2026-04-22 19:07:39.46436583 +0000 UTC m=+1245.913148158" watchObservedRunningTime="2026-04-22 19:07:39.466607926 +0000 UTC m=+1245.915390256"
firstStartedPulling="2026-04-22 19:07:31.649163051 +0000 UTC m=+1238.097945359" lastFinishedPulling="2026-04-22 19:07:32.398276629 +0000 UTC m=+1238.847058948" observedRunningTime="2026-04-22 19:07:39.46436583 +0000 UTC m=+1245.913148158" watchObservedRunningTime="2026-04-22 19:07:39.466607926 +0000 UTC m=+1245.915390256" Apr 22 19:07:41.311580 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:41.311531 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw" Apr 22 19:07:41.311580 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:41.311585 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw" Apr 22 19:07:41.313128 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:41.313099 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw" podUID="5e6afa4d-1fb6-456f-8635-3016bf0a6f8f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8001/health\": dial tcp 10.134.0.44:8001: connect: connection refused" Apr 22 19:07:51.311966 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:51.311911 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw" podUID="5e6afa4d-1fb6-456f-8635-3016bf0a6f8f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8001/health\": dial tcp 10.134.0.44:8001: connect: connection refused" Apr 22 19:07:51.328688 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:07:51.328657 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw" Apr 22 19:08:01.311654 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:01.311593 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw" podUID="5e6afa4d-1fb6-456f-8635-3016bf0a6f8f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8001/health\": dial tcp 10.134.0.44:8001: connect: connection refused" Apr 22 19:08:04.117230 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:04.117189 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp"] Apr 22 19:08:04.122334 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:04.122305 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" Apr 22 19:08:04.124446 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:04.124416 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-dockercfg-xn5cj\"" Apr 22 19:08:04.124446 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:04.124439 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 22 19:08:04.131031 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:04.130564 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp"] Apr 22 19:08:04.219347 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:04.219299 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp\" (UID: \"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" Apr 22 19:08:04.219347 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:04.219342 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp\" (UID: \"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" Apr 22 19:08:04.219641 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:04.219380 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp\" (UID: \"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" Apr 22 19:08:04.219641 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:04.219451 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp\" (UID: \"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" Apr 22 19:08:04.219641 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:04.219480 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp\" (UID: \"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" Apr 22 19:08:04.219641 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:04.219594 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8qr4\" (UniqueName: 
\"kubernetes.io/projected/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6-kube-api-access-g8qr4\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp\" (UID: \"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" Apr 22 19:08:04.320996 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:04.320934 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g8qr4\" (UniqueName: \"kubernetes.io/projected/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6-kube-api-access-g8qr4\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp\" (UID: \"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" Apr 22 19:08:04.321218 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:04.321072 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp\" (UID: \"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" Apr 22 19:08:04.321218 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:04.321104 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp\" (UID: \"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" Apr 22 19:08:04.321218 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:04.321140 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp\" (UID: \"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" Apr 22 19:08:04.321218 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:04.321200 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp\" (UID: \"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" Apr 22 19:08:04.321452 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:04.321235 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp\" (UID: \"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" Apr 22 19:08:04.321610 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:04.321587 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp\" (UID: 
\"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" Apr 22 19:08:04.321673 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:04.321624 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp\" (UID: \"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" Apr 22 19:08:04.321711 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:04.321680 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp\" (UID: \"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" Apr 22 19:08:04.323607 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:04.323584 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp\" (UID: \"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" Apr 22 19:08:04.323919 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:04.323896 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp\" (UID: \"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" Apr 22 19:08:04.329919 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:04.329881 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8qr4\" (UniqueName: \"kubernetes.io/projected/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6-kube-api-access-g8qr4\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp\" (UID: \"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" Apr 22 19:08:04.435279 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:04.435174 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" Apr 22 19:08:04.589571 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:04.589543 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp"] Apr 22 19:08:04.592747 ip-10-0-136-53 kubenswrapper[2579]: W0422 19:08:04.592708 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda63ce4d5_6781_4b0b_8c49_39821a6c0eb6.slice/crio-d4d71f14872e4f37bb66c03fdd17eb96140ba83d2f5c0f92f6735cb1b52eb56a WatchSource:0}: Error finding container d4d71f14872e4f37bb66c03fdd17eb96140ba83d2f5c0f92f6735cb1b52eb56a: Status 404 returned error can't find the container with id d4d71f14872e4f37bb66c03fdd17eb96140ba83d2f5c0f92f6735cb1b52eb56a Apr 22 19:08:05.551718 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:05.551678 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" event={"ID":"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6","Type":"ContainerStarted","Data":"b2bc46b6a91db4f517c2a83e63da6665006540a8f1a480775090729cb77f6fa6"} Apr 22 19:08:05.551718 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:05.551717 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" event={"ID":"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6","Type":"ContainerStarted","Data":"d4d71f14872e4f37bb66c03fdd17eb96140ba83d2f5c0f92f6735cb1b52eb56a"} Apr 22 19:08:05.552169 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:05.551839 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" Apr 22 19:08:06.557405 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:06.557369 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" event={"ID":"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6","Type":"ContainerStarted","Data":"84b9c6430676dcd9b1132cbc2b814ab33df746777385825bea2ef94d68b7cadc"} Apr 22 19:08:10.575336 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:10.575236 2579 generic.go:358] "Generic (PLEG): container finished" podID="a63ce4d5-6781-4b0b-8c49-39821a6c0eb6" containerID="84b9c6430676dcd9b1132cbc2b814ab33df746777385825bea2ef94d68b7cadc" exitCode=0 Apr 22 19:08:10.575336 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:10.575307 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" event={"ID":"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6","Type":"ContainerDied","Data":"84b9c6430676dcd9b1132cbc2b814ab33df746777385825bea2ef94d68b7cadc"} Apr 22 19:08:11.311256 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:11.311138 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw" podUID="5e6afa4d-1fb6-456f-8635-3016bf0a6f8f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8001/health\": dial tcp 10.134.0.44:8001: connect: connection refused" Apr 22 19:08:11.581304 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:11.581206 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" 
event={"ID":"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6","Type":"ContainerStarted","Data":"20e4c33284f25c5138cb79e0b9da3c3c30a063c7984e4a2086d80e17ab849def"} Apr 22 19:08:11.600319 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:11.600255 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" podStartSLOduration=7.600234039 podStartE2EDuration="7.600234039s" podCreationTimestamp="2026-04-22 19:08:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:08:11.598323038 +0000 UTC m=+1278.047105369" watchObservedRunningTime="2026-04-22 19:08:11.600234039 +0000 UTC m=+1278.049016370" Apr 22 19:08:14.436097 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:14.436056 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" Apr 22 19:08:14.436097 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:14.436102 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" Apr 22 19:08:14.437608 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:14.437573 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" podUID="a63ce4d5-6781-4b0b-8c49-39821a6c0eb6" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused" Apr 22 19:08:14.455112 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:14.455078 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" Apr 22 19:08:21.311372 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:21.311320 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw" podUID="5e6afa4d-1fb6-456f-8635-3016bf0a6f8f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8001/health\": dial tcp 10.134.0.44:8001: connect: connection refused" Apr 22 19:08:24.436752 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:24.436690 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" podUID="a63ce4d5-6781-4b0b-8c49-39821a6c0eb6" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused" Apr 22 19:08:31.311276 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:31.311226 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw" podUID="5e6afa4d-1fb6-456f-8635-3016bf0a6f8f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8001/health\": dial tcp 10.134.0.44:8001: connect: connection refused" Apr 22 19:08:34.435984 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:34.435930 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" podUID="a63ce4d5-6781-4b0b-8c49-39821a6c0eb6" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 
10.134.0.45:8001: connect: connection refused" Apr 22 19:08:41.311606 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:41.311543 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw" podUID="5e6afa4d-1fb6-456f-8635-3016bf0a6f8f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8001/health\": dial tcp 10.134.0.44:8001: connect: connection refused" Apr 22 19:08:44.436672 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:44.436619 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" podUID="a63ce4d5-6781-4b0b-8c49-39821a6c0eb6" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused" Apr 22 19:08:51.311566 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:51.311516 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw" podUID="5e6afa4d-1fb6-456f-8635-3016bf0a6f8f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8001/health\": dial tcp 10.134.0.44:8001: connect: connection refused" Apr 22 19:08:54.436198 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:08:54.436148 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" podUID="a63ce4d5-6781-4b0b-8c49-39821a6c0eb6" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused" Apr 22 19:09:01.311464 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:09:01.311415 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw" podUID="5e6afa4d-1fb6-456f-8635-3016bf0a6f8f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8001/health\": dial tcp 10.134.0.44:8001: connect: connection refused" Apr 22 19:09:04.435910 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:09:04.435851 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" podUID="a63ce4d5-6781-4b0b-8c49-39821a6c0eb6" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused" Apr 22 19:09:11.312041 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:09:11.311934 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw" podUID="5e6afa4d-1fb6-456f-8635-3016bf0a6f8f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8001/health\": dial tcp 10.134.0.44:8001: connect: connection refused" Apr 22 19:09:14.435788 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:09:14.435737 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" podUID="a63ce4d5-6781-4b0b-8c49-39821a6c0eb6" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused" Apr 22 19:09:21.311620 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:09:21.311565 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw" 
podUID="5e6afa4d-1fb6-456f-8635-3016bf0a6f8f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8001/health\": dial tcp 10.134.0.44:8001: connect: connection refused" Apr 22 19:09:24.436131 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:09:24.436070 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" podUID="a63ce4d5-6781-4b0b-8c49-39821a6c0eb6" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused" Apr 22 19:09:31.321051 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:09:31.321018 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw" Apr 22 19:09:31.338235 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:09:31.338207 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw" Apr 22 19:09:34.436562 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:09:34.436519 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" podUID="a63ce4d5-6781-4b0b-8c49-39821a6c0eb6" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused" Apr 22 19:09:44.436245 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:09:44.436195 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" podUID="a63ce4d5-6781-4b0b-8c49-39821a6c0eb6" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused" Apr 22 19:09:49.306587 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:09:49.306269 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw"] Apr 22 19:09:49.307801 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:09:49.307744 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw" podUID="5e6afa4d-1fb6-456f-8635-3016bf0a6f8f" containerName="main" containerID="cri-o://703eea89bf2ef326e809dc34afaabf36ccb41fc52c7b79848f89ac6b7d6c7c1a" gracePeriod=30 Apr 22 19:09:54.437124 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:09:54.437065 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" podUID="a63ce4d5-6781-4b0b-8c49-39821a6c0eb6" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused" Apr 22 19:10:04.445800 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:04.445765 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" Apr 22 19:10:04.458336 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:04.458308 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" Apr 22 19:10:19.307541 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:19.307462 2579 kuberuntime_container.go:864] "Killing 
container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw" podUID="5e6afa4d-1fb6-456f-8635-3016bf0a6f8f" containerName="llm-d-routing-sidecar" containerID="cri-o://251caf874c4830773c532880eb8e21c8eb786399aeaa53c7b1c1c93ff4b60ea7" gracePeriod=2 Apr 22 19:10:19.594005 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:19.593982 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw_5e6afa4d-1fb6-456f-8635-3016bf0a6f8f/main/0.log" Apr 22 19:10:19.594702 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:19.594682 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw" Apr 22 19:10:19.672341 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:19.672308 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f-model-cache\") pod \"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f\" (UID: \"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f\") " Apr 22 19:10:19.672549 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:19.672355 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f-home\") pod \"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f\" (UID: \"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f\") " Apr 22 19:10:19.672549 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:19.672388 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f-tls-certs\") pod \"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f\" (UID: \"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f\") " Apr 22 19:10:19.672549 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:19.672442 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f-dshm\") pod \"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f\" (UID: \"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f\") " Apr 22 19:10:19.672731 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:19.672477 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f-kserve-provision-location\") pod \"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f\" (UID: \"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f\") " Apr 22 19:10:19.672731 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:19.672658 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nm4f\" (UniqueName: \"kubernetes.io/projected/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f-kube-api-access-4nm4f\") pod \"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f\" (UID: \"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f\") " Apr 22 19:10:19.672838 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:19.672773 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f-home" (OuterVolumeSpecName: "home") pod "5e6afa4d-1fb6-456f-8635-3016bf0a6f8f" (UID: "5e6afa4d-1fb6-456f-8635-3016bf0a6f8f"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:10:19.673008 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:19.672987 2579 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f-home\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:10:19.673084 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:19.672985 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f-model-cache" (OuterVolumeSpecName: "model-cache") pod "5e6afa4d-1fb6-456f-8635-3016bf0a6f8f" (UID: "5e6afa4d-1fb6-456f-8635-3016bf0a6f8f"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:10:19.674736 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:19.674700 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f-kube-api-access-4nm4f" (OuterVolumeSpecName: "kube-api-access-4nm4f") pod "5e6afa4d-1fb6-456f-8635-3016bf0a6f8f" (UID: "5e6afa4d-1fb6-456f-8635-3016bf0a6f8f"). InnerVolumeSpecName "kube-api-access-4nm4f". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:10:19.674838 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:19.674794 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f-dshm" (OuterVolumeSpecName: "dshm") pod "5e6afa4d-1fb6-456f-8635-3016bf0a6f8f" (UID: "5e6afa4d-1fb6-456f-8635-3016bf0a6f8f"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:10:19.674838 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:19.674810 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "5e6afa4d-1fb6-456f-8635-3016bf0a6f8f" (UID: "5e6afa4d-1fb6-456f-8635-3016bf0a6f8f"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:10:19.704875 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:19.704830 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5e6afa4d-1fb6-456f-8635-3016bf0a6f8f" (UID: "5e6afa4d-1fb6-456f-8635-3016bf0a6f8f"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:10:19.773641 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:19.773604 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4nm4f\" (UniqueName: \"kubernetes.io/projected/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f-kube-api-access-4nm4f\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:10:19.773641 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:19.773636 2579 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f-model-cache\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:10:19.773641 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:19.773648 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f-tls-certs\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:10:19.773884 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:19.773661 2579 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f-dshm\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:10:19.773884 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:19.773671 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f-kserve-provision-location\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:10:20.076769 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:20.076730 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw_5e6afa4d-1fb6-456f-8635-3016bf0a6f8f/main/0.log" Apr 22 19:10:20.077402 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:20.077375 2579 generic.go:358] "Generic (PLEG): container finished" podID="5e6afa4d-1fb6-456f-8635-3016bf0a6f8f" containerID="703eea89bf2ef326e809dc34afaabf36ccb41fc52c7b79848f89ac6b7d6c7c1a" exitCode=137 Apr 22 19:10:20.077402 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:20.077396 2579 generic.go:358] "Generic (PLEG): container finished" podID="5e6afa4d-1fb6-456f-8635-3016bf0a6f8f" containerID="251caf874c4830773c532880eb8e21c8eb786399aeaa53c7b1c1c93ff4b60ea7" exitCode=0 Apr 22 19:10:20.077605 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:20.077451 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw" event={"ID":"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f","Type":"ContainerDied","Data":"703eea89bf2ef326e809dc34afaabf36ccb41fc52c7b79848f89ac6b7d6c7c1a"} Apr 22 19:10:20.077605 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:20.077469 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw" Apr 22 19:10:20.077605 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:20.077504 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw" event={"ID":"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f","Type":"ContainerDied","Data":"251caf874c4830773c532880eb8e21c8eb786399aeaa53c7b1c1c93ff4b60ea7"} Apr 22 19:10:20.077605 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:20.077516 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw" event={"ID":"5e6afa4d-1fb6-456f-8635-3016bf0a6f8f","Type":"ContainerDied","Data":"06b0953f8e81b87d5bcb1ea8bb032471dae7ceed26424d9ea637500be2716665"} Apr 22 19:10:20.077605 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:20.077530 2579 scope.go:117] "RemoveContainer" containerID="703eea89bf2ef326e809dc34afaabf36ccb41fc52c7b79848f89ac6b7d6c7c1a" Apr 22 19:10:20.098259 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:20.098237 2579 scope.go:117] "RemoveContainer" containerID="df94f190d1897547c2edabcc35f19223b24b8c21cafa85c48123371e333ed01f" Apr 22 19:10:20.104285 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:20.104246 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw"] Apr 22 19:10:20.106924 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:20.106895 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7f856d4bc9-8xwqw"] Apr 22 19:10:20.111875 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:20.111854 2579 scope.go:117] "RemoveContainer" containerID="251caf874c4830773c532880eb8e21c8eb786399aeaa53c7b1c1c93ff4b60ea7" Apr 22 19:10:20.120829 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:20.120804 2579 scope.go:117] "RemoveContainer" containerID="703eea89bf2ef326e809dc34afaabf36ccb41fc52c7b79848f89ac6b7d6c7c1a" Apr 22 19:10:20.121145 ip-10-0-136-53 kubenswrapper[2579]: E0422 19:10:20.121129 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"703eea89bf2ef326e809dc34afaabf36ccb41fc52c7b79848f89ac6b7d6c7c1a\": container with ID starting with 703eea89bf2ef326e809dc34afaabf36ccb41fc52c7b79848f89ac6b7d6c7c1a not found: ID does not exist" containerID="703eea89bf2ef326e809dc34afaabf36ccb41fc52c7b79848f89ac6b7d6c7c1a" Apr 22 19:10:20.121211 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:20.121153 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"703eea89bf2ef326e809dc34afaabf36ccb41fc52c7b79848f89ac6b7d6c7c1a"} err="failed to get container status \"703eea89bf2ef326e809dc34afaabf36ccb41fc52c7b79848f89ac6b7d6c7c1a\": rpc error: code = NotFound desc = could not find container \"703eea89bf2ef326e809dc34afaabf36ccb41fc52c7b79848f89ac6b7d6c7c1a\": container with ID starting with 703eea89bf2ef326e809dc34afaabf36ccb41fc52c7b79848f89ac6b7d6c7c1a not found: ID does not exist" Apr 22 19:10:20.121211 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:20.121172 2579 scope.go:117] "RemoveContainer" containerID="df94f190d1897547c2edabcc35f19223b24b8c21cafa85c48123371e333ed01f" Apr 22 19:10:20.121416 ip-10-0-136-53 kubenswrapper[2579]: E0422 19:10:20.121402 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"df94f190d1897547c2edabcc35f19223b24b8c21cafa85c48123371e333ed01f\": container with ID starting with df94f190d1897547c2edabcc35f19223b24b8c21cafa85c48123371e333ed01f not found: ID does not exist" containerID="df94f190d1897547c2edabcc35f19223b24b8c21cafa85c48123371e333ed01f" Apr 22 19:10:20.121476 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:20.121419 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df94f190d1897547c2edabcc35f19223b24b8c21cafa85c48123371e333ed01f"} err="failed to get container status \"df94f190d1897547c2edabcc35f19223b24b8c21cafa85c48123371e333ed01f\": rpc error: code = NotFound desc = could not find container \"df94f190d1897547c2edabcc35f19223b24b8c21cafa85c48123371e333ed01f\": container with ID starting with df94f190d1897547c2edabcc35f19223b24b8c21cafa85c48123371e333ed01f not found: ID does not exist" Apr 22 19:10:20.121476 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:20.121433 2579 scope.go:117] "RemoveContainer" containerID="251caf874c4830773c532880eb8e21c8eb786399aeaa53c7b1c1c93ff4b60ea7" Apr 22 19:10:20.121708 ip-10-0-136-53 kubenswrapper[2579]: E0422 19:10:20.121677 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"251caf874c4830773c532880eb8e21c8eb786399aeaa53c7b1c1c93ff4b60ea7\": container with ID starting with 251caf874c4830773c532880eb8e21c8eb786399aeaa53c7b1c1c93ff4b60ea7 not found: ID does not exist" containerID="251caf874c4830773c532880eb8e21c8eb786399aeaa53c7b1c1c93ff4b60ea7" Apr 22 19:10:20.121769 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:20.121708 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"251caf874c4830773c532880eb8e21c8eb786399aeaa53c7b1c1c93ff4b60ea7"} err="failed to get container status \"251caf874c4830773c532880eb8e21c8eb786399aeaa53c7b1c1c93ff4b60ea7\": rpc error: code = NotFound desc = could not find container \"251caf874c4830773c532880eb8e21c8eb786399aeaa53c7b1c1c93ff4b60ea7\": container with ID starting with 251caf874c4830773c532880eb8e21c8eb786399aeaa53c7b1c1c93ff4b60ea7 not found: ID does not exist" Apr 22 19:10:20.121769 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:20.121730 2579 scope.go:117] "RemoveContainer" containerID="703eea89bf2ef326e809dc34afaabf36ccb41fc52c7b79848f89ac6b7d6c7c1a" Apr 22 19:10:20.121997 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:20.121978 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"703eea89bf2ef326e809dc34afaabf36ccb41fc52c7b79848f89ac6b7d6c7c1a"} err="failed to get container status \"703eea89bf2ef326e809dc34afaabf36ccb41fc52c7b79848f89ac6b7d6c7c1a\": rpc error: code = NotFound desc = could not find container \"703eea89bf2ef326e809dc34afaabf36ccb41fc52c7b79848f89ac6b7d6c7c1a\": container with ID starting with 703eea89bf2ef326e809dc34afaabf36ccb41fc52c7b79848f89ac6b7d6c7c1a not found: ID does not exist" Apr 22 19:10:20.122061 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:20.121998 2579 scope.go:117] "RemoveContainer" containerID="df94f190d1897547c2edabcc35f19223b24b8c21cafa85c48123371e333ed01f" Apr 22 19:10:20.122249 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:20.122231 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df94f190d1897547c2edabcc35f19223b24b8c21cafa85c48123371e333ed01f"} err="failed to get container status \"df94f190d1897547c2edabcc35f19223b24b8c21cafa85c48123371e333ed01f\": rpc error: code 
= NotFound desc = could not find container \"df94f190d1897547c2edabcc35f19223b24b8c21cafa85c48123371e333ed01f\": container with ID starting with df94f190d1897547c2edabcc35f19223b24b8c21cafa85c48123371e333ed01f not found: ID does not exist" Apr 22 19:10:20.122313 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:20.122250 2579 scope.go:117] "RemoveContainer" containerID="251caf874c4830773c532880eb8e21c8eb786399aeaa53c7b1c1c93ff4b60ea7" Apr 22 19:10:20.122532 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:20.122500 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"251caf874c4830773c532880eb8e21c8eb786399aeaa53c7b1c1c93ff4b60ea7"} err="failed to get container status \"251caf874c4830773c532880eb8e21c8eb786399aeaa53c7b1c1c93ff4b60ea7\": rpc error: code = NotFound desc = could not find container \"251caf874c4830773c532880eb8e21c8eb786399aeaa53c7b1c1c93ff4b60ea7\": container with ID starting with 251caf874c4830773c532880eb8e21c8eb786399aeaa53c7b1c1c93ff4b60ea7 not found: ID does not exist" Apr 22 19:10:20.165101 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:20.165059 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e6afa4d-1fb6-456f-8635-3016bf0a6f8f" path="/var/lib/kubelet/pods/5e6afa4d-1fb6-456f-8635-3016bf0a6f8f/volumes" Apr 22 19:10:25.241675 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:25.241639 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp"] Apr 22 19:10:25.242160 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:25.242129 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" podUID="a63ce4d5-6781-4b0b-8c49-39821a6c0eb6" containerName="main" containerID="cri-o://20e4c33284f25c5138cb79e0b9da3c3c30a063c7984e4a2086d80e17ab849def" gracePeriod=30 Apr 22 19:10:29.417556 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:29.417515 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw"] Apr 22 19:10:29.418003 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:29.417951 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e6afa4d-1fb6-456f-8635-3016bf0a6f8f" containerName="llm-d-routing-sidecar" Apr 22 19:10:29.418003 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:29.417966 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e6afa4d-1fb6-456f-8635-3016bf0a6f8f" containerName="llm-d-routing-sidecar" Apr 22 19:10:29.418003 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:29.417980 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e6afa4d-1fb6-456f-8635-3016bf0a6f8f" containerName="main" Apr 22 19:10:29.418003 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:29.417985 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e6afa4d-1fb6-456f-8635-3016bf0a6f8f" containerName="main" Apr 22 19:10:29.418237 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:29.418006 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e6afa4d-1fb6-456f-8635-3016bf0a6f8f" containerName="storage-initializer" Apr 22 19:10:29.418237 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:29.418013 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e6afa4d-1fb6-456f-8635-3016bf0a6f8f" containerName="storage-initializer" Apr 22 19:10:29.418237 ip-10-0-136-53 
kubenswrapper[2579]: I0422 19:10:29.418065 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="5e6afa4d-1fb6-456f-8635-3016bf0a6f8f" containerName="main" Apr 22 19:10:29.418237 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:29.418075 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="5e6afa4d-1fb6-456f-8635-3016bf0a6f8f" containerName="llm-d-routing-sidecar" Apr 22 19:10:29.422642 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:29.422617 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" Apr 22 19:10:29.424964 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:29.424937 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-dockercfg-krp4j\"" Apr 22 19:10:29.425107 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:29.425012 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 22 19:10:29.434028 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:29.434001 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw"] Apr 22 19:10:29.573702 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:29.573663 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2218-fce7-4554-91da-13d2bedb87f5-dshm\") pod \"custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw\" (UID: \"6b9a2218-fce7-4554-91da-13d2bedb87f5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" Apr 22 19:10:29.573702 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:29.573706 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2218-fce7-4554-91da-13d2bedb87f5-home\") pod \"custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw\" (UID: \"6b9a2218-fce7-4554-91da-13d2bedb87f5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" Apr 22 19:10:29.573971 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:29.573725 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2218-fce7-4554-91da-13d2bedb87f5-model-cache\") pod \"custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw\" (UID: \"6b9a2218-fce7-4554-91da-13d2bedb87f5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" Apr 22 19:10:29.573971 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:29.573821 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2218-fce7-4554-91da-13d2bedb87f5-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw\" (UID: \"6b9a2218-fce7-4554-91da-13d2bedb87f5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" Apr 22 19:10:29.573971 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:29.573870 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6b9a2218-fce7-4554-91da-13d2bedb87f5-tls-certs\") pod 
\"custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw\" (UID: \"6b9a2218-fce7-4554-91da-13d2bedb87f5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" Apr 22 19:10:29.573971 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:29.573960 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gbkw\" (UniqueName: \"kubernetes.io/projected/6b9a2218-fce7-4554-91da-13d2bedb87f5-kube-api-access-8gbkw\") pod \"custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw\" (UID: \"6b9a2218-fce7-4554-91da-13d2bedb87f5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" Apr 22 19:10:29.675355 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:29.675257 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2218-fce7-4554-91da-13d2bedb87f5-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw\" (UID: \"6b9a2218-fce7-4554-91da-13d2bedb87f5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" Apr 22 19:10:29.675355 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:29.675323 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6b9a2218-fce7-4554-91da-13d2bedb87f5-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw\" (UID: \"6b9a2218-fce7-4554-91da-13d2bedb87f5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" Apr 22 19:10:29.675628 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:29.675369 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8gbkw\" (UniqueName: \"kubernetes.io/projected/6b9a2218-fce7-4554-91da-13d2bedb87f5-kube-api-access-8gbkw\") pod \"custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw\" (UID: \"6b9a2218-fce7-4554-91da-13d2bedb87f5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" Apr 22 19:10:29.675628 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:29.675444 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2218-fce7-4554-91da-13d2bedb87f5-dshm\") pod \"custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw\" (UID: \"6b9a2218-fce7-4554-91da-13d2bedb87f5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" Apr 22 19:10:29.675628 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:29.675470 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2218-fce7-4554-91da-13d2bedb87f5-home\") pod \"custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw\" (UID: \"6b9a2218-fce7-4554-91da-13d2bedb87f5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" Apr 22 19:10:29.675628 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:29.675526 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2218-fce7-4554-91da-13d2bedb87f5-model-cache\") pod \"custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw\" (UID: \"6b9a2218-fce7-4554-91da-13d2bedb87f5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" Apr 22 19:10:29.675853 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:29.675781 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2218-fce7-4554-91da-13d2bedb87f5-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw\" (UID: \"6b9a2218-fce7-4554-91da-13d2bedb87f5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" Apr 22 19:10:29.675908 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:29.675876 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2218-fce7-4554-91da-13d2bedb87f5-model-cache\") pod \"custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw\" (UID: \"6b9a2218-fce7-4554-91da-13d2bedb87f5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" Apr 22 19:10:29.676080 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:29.676005 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2218-fce7-4554-91da-13d2bedb87f5-home\") pod \"custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw\" (UID: \"6b9a2218-fce7-4554-91da-13d2bedb87f5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" Apr 22 19:10:29.677890 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:29.677863 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2218-fce7-4554-91da-13d2bedb87f5-dshm\") pod \"custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw\" (UID: \"6b9a2218-fce7-4554-91da-13d2bedb87f5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" Apr 22 19:10:29.678132 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:29.678114 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6b9a2218-fce7-4554-91da-13d2bedb87f5-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw\" (UID: \"6b9a2218-fce7-4554-91da-13d2bedb87f5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" Apr 22 19:10:29.682619 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:29.682597 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gbkw\" (UniqueName: \"kubernetes.io/projected/6b9a2218-fce7-4554-91da-13d2bedb87f5-kube-api-access-8gbkw\") pod \"custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw\" (UID: \"6b9a2218-fce7-4554-91da-13d2bedb87f5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" Apr 22 19:10:29.736614 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:29.736576 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" Apr 22 19:10:29.875233 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:29.875197 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw"] Apr 22 19:10:29.879065 ip-10-0-136-53 kubenswrapper[2579]: W0422 19:10:29.879031 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b9a2218_fce7_4554_91da_13d2bedb87f5.slice/crio-e4174be2348051df121ca90d95d5d5789379fee120b251d06a9b46ea80eabfae WatchSource:0}: Error finding container e4174be2348051df121ca90d95d5d5789379fee120b251d06a9b46ea80eabfae: Status 404 returned error can't find the container with id e4174be2348051df121ca90d95d5d5789379fee120b251d06a9b46ea80eabfae Apr 22 19:10:30.116984 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:30.116950 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" event={"ID":"6b9a2218-fce7-4554-91da-13d2bedb87f5","Type":"ContainerStarted","Data":"d8a048bc69b6711a41174fffce836b19708a838cc2e07a83099ee35fa11a4b00"} Apr 22 19:10:30.116984 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:30.116988 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" event={"ID":"6b9a2218-fce7-4554-91da-13d2bedb87f5","Type":"ContainerStarted","Data":"e4174be2348051df121ca90d95d5d5789379fee120b251d06a9b46ea80eabfae"} Apr 22 19:10:30.117239 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:30.117031 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" Apr 22 19:10:31.122726 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:31.122686 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" event={"ID":"6b9a2218-fce7-4554-91da-13d2bedb87f5","Type":"ContainerStarted","Data":"e9859028c4eda90abfd2ce01fb536f0bb45d28f7523187feb41f9f26afa85876"} Apr 22 19:10:35.145102 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:35.145065 2579 generic.go:358] "Generic (PLEG): container finished" podID="6b9a2218-fce7-4554-91da-13d2bedb87f5" containerID="e9859028c4eda90abfd2ce01fb536f0bb45d28f7523187feb41f9f26afa85876" exitCode=0 Apr 22 19:10:35.145528 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:35.145134 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" event={"ID":"6b9a2218-fce7-4554-91da-13d2bedb87f5","Type":"ContainerDied","Data":"e9859028c4eda90abfd2ce01fb536f0bb45d28f7523187feb41f9f26afa85876"} Apr 22 19:10:36.150445 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:36.150411 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" event={"ID":"6b9a2218-fce7-4554-91da-13d2bedb87f5","Type":"ContainerStarted","Data":"97675c43e1fb27c4a8e11deaefd1e4088838937a0247f379d094fddfe266c666"} Apr 22 19:10:36.170318 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:36.170260 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" podStartSLOduration=7.170242218 podStartE2EDuration="7.170242218s" podCreationTimestamp="2026-04-22 19:10:29 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:10:36.167902561 +0000 UTC m=+1422.616684909" watchObservedRunningTime="2026-04-22 19:10:36.170242218 +0000 UTC m=+1422.619024550" Apr 22 19:10:39.736951 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:39.736849 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" Apr 22 19:10:39.736951 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:39.736908 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" Apr 22 19:10:39.738315 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:39.738277 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" podUID="6b9a2218-fce7-4554-91da-13d2bedb87f5" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8001/health\": dial tcp 10.134.0.46:8001: connect: connection refused" Apr 22 19:10:39.749928 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:39.749903 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" Apr 22 19:10:49.737268 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:49.737218 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" podUID="6b9a2218-fce7-4554-91da-13d2bedb87f5" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8001/health\": dial tcp 10.134.0.46:8001: connect: connection refused" Apr 22 19:10:55.243127 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:55.243081 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" podUID="a63ce4d5-6781-4b0b-8c49-39821a6c0eb6" containerName="llm-d-routing-sidecar" containerID="cri-o://b2bc46b6a91db4f517c2a83e63da6665006540a8f1a480775090729cb77f6fa6" gracePeriod=2 Apr 22 19:10:55.532652 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:55.532620 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp_a63ce4d5-6781-4b0b-8c49-39821a6c0eb6/main/0.log" Apr 22 19:10:55.533351 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:55.533332 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" Apr 22 19:10:55.631935 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:55.631905 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6-kserve-provision-location\") pod \"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6\" (UID: \"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6\") " Apr 22 19:10:55.632119 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:55.631950 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6-dshm\") pod \"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6\" (UID: \"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6\") " Apr 22 19:10:55.632119 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:55.631981 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6-model-cache\") pod \"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6\" (UID: \"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6\") " Apr 22 19:10:55.632119 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:55.632011 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6-tls-certs\") pod \"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6\" (UID: \"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6\") " Apr 22 19:10:55.632119 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:55.632062 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8qr4\" (UniqueName: \"kubernetes.io/projected/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6-kube-api-access-g8qr4\") pod \"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6\" (UID: \"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6\") " Apr 22 19:10:55.632119 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:55.632109 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6-home\") pod \"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6\" (UID: \"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6\") " Apr 22 19:10:55.632393 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:55.632237 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6-model-cache" (OuterVolumeSpecName: "model-cache") pod "a63ce4d5-6781-4b0b-8c49-39821a6c0eb6" (UID: "a63ce4d5-6781-4b0b-8c49-39821a6c0eb6"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:10:55.632576 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:55.632548 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6-home" (OuterVolumeSpecName: "home") pod "a63ce4d5-6781-4b0b-8c49-39821a6c0eb6" (UID: "a63ce4d5-6781-4b0b-8c49-39821a6c0eb6"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:10:55.632750 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:55.632729 2579 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6-model-cache\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:10:55.632750 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:55.632752 2579 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6-home\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:10:55.634270 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:55.634249 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6-dshm" (OuterVolumeSpecName: "dshm") pod "a63ce4d5-6781-4b0b-8c49-39821a6c0eb6" (UID: "a63ce4d5-6781-4b0b-8c49-39821a6c0eb6"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:10:55.634451 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:55.634436 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a63ce4d5-6781-4b0b-8c49-39821a6c0eb6" (UID: "a63ce4d5-6781-4b0b-8c49-39821a6c0eb6"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:10:55.634754 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:55.634737 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6-kube-api-access-g8qr4" (OuterVolumeSpecName: "kube-api-access-g8qr4") pod "a63ce4d5-6781-4b0b-8c49-39821a6c0eb6" (UID: "a63ce4d5-6781-4b0b-8c49-39821a6c0eb6"). InnerVolumeSpecName "kube-api-access-g8qr4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:10:55.664893 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:55.664458 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a63ce4d5-6781-4b0b-8c49-39821a6c0eb6" (UID: "a63ce4d5-6781-4b0b-8c49-39821a6c0eb6"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:10:55.734167 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:55.734126 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6-kserve-provision-location\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:10:55.734167 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:55.734160 2579 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6-dshm\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:10:55.734167 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:55.734173 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6-tls-certs\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:10:55.734410 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:55.734190 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g8qr4\" (UniqueName: \"kubernetes.io/projected/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6-kube-api-access-g8qr4\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:10:56.224049 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:56.224021 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp_a63ce4d5-6781-4b0b-8c49-39821a6c0eb6/main/0.log" Apr 22 19:10:56.224747 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:56.224718 2579 generic.go:358] "Generic (PLEG): container finished" podID="a63ce4d5-6781-4b0b-8c49-39821a6c0eb6" containerID="20e4c33284f25c5138cb79e0b9da3c3c30a063c7984e4a2086d80e17ab849def" exitCode=137 Apr 22 19:10:56.224747 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:56.224742 2579 generic.go:358] "Generic (PLEG): container finished" podID="a63ce4d5-6781-4b0b-8c49-39821a6c0eb6" containerID="b2bc46b6a91db4f517c2a83e63da6665006540a8f1a480775090729cb77f6fa6" exitCode=0 Apr 22 19:10:56.224914 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:56.224806 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" event={"ID":"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6","Type":"ContainerDied","Data":"20e4c33284f25c5138cb79e0b9da3c3c30a063c7984e4a2086d80e17ab849def"} Apr 22 19:10:56.224914 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:56.224824 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" Apr 22 19:10:56.224914 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:56.224845 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" event={"ID":"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6","Type":"ContainerDied","Data":"b2bc46b6a91db4f517c2a83e63da6665006540a8f1a480775090729cb77f6fa6"} Apr 22 19:10:56.224914 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:56.224856 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp" event={"ID":"a63ce4d5-6781-4b0b-8c49-39821a6c0eb6","Type":"ContainerDied","Data":"d4d71f14872e4f37bb66c03fdd17eb96140ba83d2f5c0f92f6735cb1b52eb56a"} Apr 22 19:10:56.224914 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:56.224871 2579 scope.go:117] "RemoveContainer" containerID="20e4c33284f25c5138cb79e0b9da3c3c30a063c7984e4a2086d80e17ab849def" Apr 22 19:10:56.241661 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:56.241627 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp"] Apr 22 19:10:56.246223 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:56.246005 2579 scope.go:117] "RemoveContainer" containerID="84b9c6430676dcd9b1132cbc2b814ab33df746777385825bea2ef94d68b7cadc" Apr 22 19:10:56.248054 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:56.248031 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-866d98475d79zzp"] Apr 22 19:10:56.292067 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:56.292041 2579 scope.go:117] "RemoveContainer" containerID="b2bc46b6a91db4f517c2a83e63da6665006540a8f1a480775090729cb77f6fa6" Apr 22 19:10:56.301033 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:56.301007 2579 scope.go:117] "RemoveContainer" containerID="20e4c33284f25c5138cb79e0b9da3c3c30a063c7984e4a2086d80e17ab849def" Apr 22 19:10:56.301345 ip-10-0-136-53 kubenswrapper[2579]: E0422 19:10:56.301317 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20e4c33284f25c5138cb79e0b9da3c3c30a063c7984e4a2086d80e17ab849def\": container with ID starting with 20e4c33284f25c5138cb79e0b9da3c3c30a063c7984e4a2086d80e17ab849def not found: ID does not exist" containerID="20e4c33284f25c5138cb79e0b9da3c3c30a063c7984e4a2086d80e17ab849def" Apr 22 19:10:56.301407 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:56.301355 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20e4c33284f25c5138cb79e0b9da3c3c30a063c7984e4a2086d80e17ab849def"} err="failed to get container status \"20e4c33284f25c5138cb79e0b9da3c3c30a063c7984e4a2086d80e17ab849def\": rpc error: code = NotFound desc = could not find container \"20e4c33284f25c5138cb79e0b9da3c3c30a063c7984e4a2086d80e17ab849def\": container with ID starting with 20e4c33284f25c5138cb79e0b9da3c3c30a063c7984e4a2086d80e17ab849def not found: ID does not exist" Apr 22 19:10:56.301407 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:56.301382 2579 scope.go:117] "RemoveContainer" containerID="84b9c6430676dcd9b1132cbc2b814ab33df746777385825bea2ef94d68b7cadc" Apr 22 19:10:56.301630 ip-10-0-136-53 kubenswrapper[2579]: E0422 19:10:56.301611 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"84b9c6430676dcd9b1132cbc2b814ab33df746777385825bea2ef94d68b7cadc\": container with ID starting with 84b9c6430676dcd9b1132cbc2b814ab33df746777385825bea2ef94d68b7cadc not found: ID does not exist" containerID="84b9c6430676dcd9b1132cbc2b814ab33df746777385825bea2ef94d68b7cadc" Apr 22 19:10:56.301674 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:56.301634 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84b9c6430676dcd9b1132cbc2b814ab33df746777385825bea2ef94d68b7cadc"} err="failed to get container status \"84b9c6430676dcd9b1132cbc2b814ab33df746777385825bea2ef94d68b7cadc\": rpc error: code = NotFound desc = could not find container \"84b9c6430676dcd9b1132cbc2b814ab33df746777385825bea2ef94d68b7cadc\": container with ID starting with 84b9c6430676dcd9b1132cbc2b814ab33df746777385825bea2ef94d68b7cadc not found: ID does not exist" Apr 22 19:10:56.301674 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:56.301655 2579 scope.go:117] "RemoveContainer" containerID="b2bc46b6a91db4f517c2a83e63da6665006540a8f1a480775090729cb77f6fa6" Apr 22 19:10:56.301876 ip-10-0-136-53 kubenswrapper[2579]: E0422 19:10:56.301858 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2bc46b6a91db4f517c2a83e63da6665006540a8f1a480775090729cb77f6fa6\": container with ID starting with b2bc46b6a91db4f517c2a83e63da6665006540a8f1a480775090729cb77f6fa6 not found: ID does not exist" containerID="b2bc46b6a91db4f517c2a83e63da6665006540a8f1a480775090729cb77f6fa6" Apr 22 19:10:56.301920 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:56.301882 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2bc46b6a91db4f517c2a83e63da6665006540a8f1a480775090729cb77f6fa6"} err="failed to get container status \"b2bc46b6a91db4f517c2a83e63da6665006540a8f1a480775090729cb77f6fa6\": rpc error: code = NotFound desc = could not find container \"b2bc46b6a91db4f517c2a83e63da6665006540a8f1a480775090729cb77f6fa6\": container with ID starting with b2bc46b6a91db4f517c2a83e63da6665006540a8f1a480775090729cb77f6fa6 not found: ID does not exist" Apr 22 19:10:56.301920 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:56.301900 2579 scope.go:117] "RemoveContainer" containerID="20e4c33284f25c5138cb79e0b9da3c3c30a063c7984e4a2086d80e17ab849def" Apr 22 19:10:56.302123 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:56.302106 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20e4c33284f25c5138cb79e0b9da3c3c30a063c7984e4a2086d80e17ab849def"} err="failed to get container status \"20e4c33284f25c5138cb79e0b9da3c3c30a063c7984e4a2086d80e17ab849def\": rpc error: code = NotFound desc = could not find container \"20e4c33284f25c5138cb79e0b9da3c3c30a063c7984e4a2086d80e17ab849def\": container with ID starting with 20e4c33284f25c5138cb79e0b9da3c3c30a063c7984e4a2086d80e17ab849def not found: ID does not exist" Apr 22 19:10:56.302173 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:56.302123 2579 scope.go:117] "RemoveContainer" containerID="84b9c6430676dcd9b1132cbc2b814ab33df746777385825bea2ef94d68b7cadc" Apr 22 19:10:56.302329 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:56.302312 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84b9c6430676dcd9b1132cbc2b814ab33df746777385825bea2ef94d68b7cadc"} err="failed to get container status 
\"84b9c6430676dcd9b1132cbc2b814ab33df746777385825bea2ef94d68b7cadc\": rpc error: code = NotFound desc = could not find container \"84b9c6430676dcd9b1132cbc2b814ab33df746777385825bea2ef94d68b7cadc\": container with ID starting with 84b9c6430676dcd9b1132cbc2b814ab33df746777385825bea2ef94d68b7cadc not found: ID does not exist" Apr 22 19:10:56.302370 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:56.302330 2579 scope.go:117] "RemoveContainer" containerID="b2bc46b6a91db4f517c2a83e63da6665006540a8f1a480775090729cb77f6fa6" Apr 22 19:10:56.302575 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:56.302544 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2bc46b6a91db4f517c2a83e63da6665006540a8f1a480775090729cb77f6fa6"} err="failed to get container status \"b2bc46b6a91db4f517c2a83e63da6665006540a8f1a480775090729cb77f6fa6\": rpc error: code = NotFound desc = could not find container \"b2bc46b6a91db4f517c2a83e63da6665006540a8f1a480775090729cb77f6fa6\": container with ID starting with b2bc46b6a91db4f517c2a83e63da6665006540a8f1a480775090729cb77f6fa6 not found: ID does not exist" Apr 22 19:10:58.165194 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:58.165159 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a63ce4d5-6781-4b0b-8c49-39821a6c0eb6" path="/var/lib/kubelet/pods/a63ce4d5-6781-4b0b-8c49-39821a6c0eb6/volumes" Apr 22 19:10:59.737694 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:10:59.737644 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" podUID="6b9a2218-fce7-4554-91da-13d2bedb87f5" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8001/health\": dial tcp 10.134.0.46:8001: connect: connection refused" Apr 22 19:11:09.737069 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:11:09.737008 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" podUID="6b9a2218-fce7-4554-91da-13d2bedb87f5" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8001/health\": dial tcp 10.134.0.46:8001: connect: connection refused" Apr 22 19:11:19.737663 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:11:19.737612 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" podUID="6b9a2218-fce7-4554-91da-13d2bedb87f5" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8001/health\": dial tcp 10.134.0.46:8001: connect: connection refused" Apr 22 19:11:29.737274 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:11:29.737216 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" podUID="6b9a2218-fce7-4554-91da-13d2bedb87f5" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8001/health\": dial tcp 10.134.0.46:8001: connect: connection refused" Apr 22 19:11:39.737566 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:11:39.737511 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" podUID="6b9a2218-fce7-4554-91da-13d2bedb87f5" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8001/health\": dial tcp 10.134.0.46:8001: connect: connection refused" Apr 22 19:11:49.737563 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:11:49.737478 2579 prober.go:120] "Probe 
failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" podUID="6b9a2218-fce7-4554-91da-13d2bedb87f5" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8001/health\": dial tcp 10.134.0.46:8001: connect: connection refused" Apr 22 19:11:54.150903 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:11:54.150869 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tcrqv_e741af54-1487-45cb-b5ea-22577783221e/ovn-acl-logging/0.log" Apr 22 19:11:54.155347 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:11:54.155325 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tcrqv_e741af54-1487-45cb-b5ea-22577783221e/ovn-acl-logging/0.log" Apr 22 19:11:59.737439 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:11:59.737391 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" podUID="6b9a2218-fce7-4554-91da-13d2bedb87f5" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8001/health\": dial tcp 10.134.0.46:8001: connect: connection refused" Apr 22 19:12:09.738043 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:09.737932 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" podUID="6b9a2218-fce7-4554-91da-13d2bedb87f5" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8001/health\": dial tcp 10.134.0.46:8001: connect: connection refused" Apr 22 19:12:19.747384 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:19.747352 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" Apr 22 19:12:19.760616 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:19.760583 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" Apr 22 19:12:28.382024 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:28.381986 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 22 19:12:28.382534 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:28.382402 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a63ce4d5-6781-4b0b-8c49-39821a6c0eb6" containerName="llm-d-routing-sidecar" Apr 22 19:12:28.382534 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:28.382418 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="a63ce4d5-6781-4b0b-8c49-39821a6c0eb6" containerName="llm-d-routing-sidecar" Apr 22 19:12:28.382534 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:28.382432 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a63ce4d5-6781-4b0b-8c49-39821a6c0eb6" containerName="storage-initializer" Apr 22 19:12:28.382534 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:28.382440 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="a63ce4d5-6781-4b0b-8c49-39821a6c0eb6" containerName="storage-initializer" Apr 22 19:12:28.382534 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:28.382450 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a63ce4d5-6781-4b0b-8c49-39821a6c0eb6" containerName="main" Apr 22 19:12:28.382534 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:28.382458 2579 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="a63ce4d5-6781-4b0b-8c49-39821a6c0eb6" containerName="main" Apr 22 19:12:28.382534 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:28.382533 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="a63ce4d5-6781-4b0b-8c49-39821a6c0eb6" containerName="llm-d-routing-sidecar" Apr 22 19:12:28.382860 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:28.382542 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="a63ce4d5-6781-4b0b-8c49-39821a6c0eb6" containerName="main" Apr 22 19:12:28.384964 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:28.384939 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:12:28.387855 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:28.387823 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-4wcjg\"" Apr 22 19:12:28.388001 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:28.387825 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 22 19:12:28.395391 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:28.395361 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 22 19:12:28.539689 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:28.539654 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88ff248d-fc64-4559-8eb3-16b93beac2cc-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"88ff248d-fc64-4559-8eb3-16b93beac2cc\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:12:28.539689 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:28.539695 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24tt4\" (UniqueName: \"kubernetes.io/projected/88ff248d-fc64-4559-8eb3-16b93beac2cc-kube-api-access-24tt4\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"88ff248d-fc64-4559-8eb3-16b93beac2cc\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:12:28.539924 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:28.539745 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/88ff248d-fc64-4559-8eb3-16b93beac2cc-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"88ff248d-fc64-4559-8eb3-16b93beac2cc\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:12:28.539924 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:28.539781 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/88ff248d-fc64-4559-8eb3-16b93beac2cc-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"88ff248d-fc64-4559-8eb3-16b93beac2cc\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:12:28.539924 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:28.539860 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/88ff248d-fc64-4559-8eb3-16b93beac2cc-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"88ff248d-fc64-4559-8eb3-16b93beac2cc\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:12:28.539924 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:28.539888 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/88ff248d-fc64-4559-8eb3-16b93beac2cc-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"88ff248d-fc64-4559-8eb3-16b93beac2cc\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:12:28.640503 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:28.640392 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/88ff248d-fc64-4559-8eb3-16b93beac2cc-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"88ff248d-fc64-4559-8eb3-16b93beac2cc\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:12:28.640503 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:28.640442 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/88ff248d-fc64-4559-8eb3-16b93beac2cc-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"88ff248d-fc64-4559-8eb3-16b93beac2cc\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:12:28.640738 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:28.640543 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/88ff248d-fc64-4559-8eb3-16b93beac2cc-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"88ff248d-fc64-4559-8eb3-16b93beac2cc\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:12:28.640738 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:28.640582 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/88ff248d-fc64-4559-8eb3-16b93beac2cc-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"88ff248d-fc64-4559-8eb3-16b93beac2cc\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:12:28.640738 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:28.640649 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88ff248d-fc64-4559-8eb3-16b93beac2cc-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"88ff248d-fc64-4559-8eb3-16b93beac2cc\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:12:28.640738 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:28.640679 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-24tt4\" (UniqueName: \"kubernetes.io/projected/88ff248d-fc64-4559-8eb3-16b93beac2cc-kube-api-access-24tt4\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"88ff248d-fc64-4559-8eb3-16b93beac2cc\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:12:28.640957 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:28.640802 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/88ff248d-fc64-4559-8eb3-16b93beac2cc-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"88ff248d-fc64-4559-8eb3-16b93beac2cc\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:12:28.640957 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:28.640839 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/88ff248d-fc64-4559-8eb3-16b93beac2cc-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"88ff248d-fc64-4559-8eb3-16b93beac2cc\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:12:28.641059 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:28.640961 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88ff248d-fc64-4559-8eb3-16b93beac2cc-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"88ff248d-fc64-4559-8eb3-16b93beac2cc\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:12:28.642846 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:28.642824 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/88ff248d-fc64-4559-8eb3-16b93beac2cc-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"88ff248d-fc64-4559-8eb3-16b93beac2cc\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:12:28.642973 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:28.642936 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/88ff248d-fc64-4559-8eb3-16b93beac2cc-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"88ff248d-fc64-4559-8eb3-16b93beac2cc\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:12:28.653464 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:28.653430 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-24tt4\" (UniqueName: \"kubernetes.io/projected/88ff248d-fc64-4559-8eb3-16b93beac2cc-kube-api-access-24tt4\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"88ff248d-fc64-4559-8eb3-16b93beac2cc\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:12:28.697647 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:28.697609 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:12:28.828592 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:28.828565 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 22 19:12:28.831920 ip-10-0-136-53 kubenswrapper[2579]: W0422 19:12:28.831883 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88ff248d_fc64_4559_8eb3_16b93beac2cc.slice/crio-3d5cd37f2953e34dfb8392fe4ac889297596811b691571793069ef4bbd794531 WatchSource:0}: Error finding container 3d5cd37f2953e34dfb8392fe4ac889297596811b691571793069ef4bbd794531: Status 404 returned error can't find the container with id 3d5cd37f2953e34dfb8392fe4ac889297596811b691571793069ef4bbd794531 Apr 22 19:12:28.834333 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:28.834306 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:12:29.567443 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:29.567402 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"88ff248d-fc64-4559-8eb3-16b93beac2cc","Type":"ContainerStarted","Data":"d70ef613a734c0684200ab4ff6b14b87c62000b9d4e0ba2f5be0c00a79a957f9"} Apr 22 19:12:29.567443 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:29.567443 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"88ff248d-fc64-4559-8eb3-16b93beac2cc","Type":"ContainerStarted","Data":"3d5cd37f2953e34dfb8392fe4ac889297596811b691571793069ef4bbd794531"} Apr 22 19:12:33.583467 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:33.583371 2579 generic.go:358] "Generic (PLEG): container finished" podID="88ff248d-fc64-4559-8eb3-16b93beac2cc" containerID="d70ef613a734c0684200ab4ff6b14b87c62000b9d4e0ba2f5be0c00a79a957f9" exitCode=0 Apr 22 19:12:33.583467 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:33.583441 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"88ff248d-fc64-4559-8eb3-16b93beac2cc","Type":"ContainerDied","Data":"d70ef613a734c0684200ab4ff6b14b87c62000b9d4e0ba2f5be0c00a79a957f9"} Apr 22 19:12:34.589646 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:34.589611 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"88ff248d-fc64-4559-8eb3-16b93beac2cc","Type":"ContainerStarted","Data":"dfc5e248e307af793db4ccb6fd552e35d59be912420919f1dce09d39bae6ef71"} Apr 22 19:12:34.607328 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:34.607269 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podStartSLOduration=6.607248237 podStartE2EDuration="6.607248237s" podCreationTimestamp="2026-04-22 19:12:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:12:34.605284259 +0000 UTC m=+1541.054066600" watchObservedRunningTime="2026-04-22 19:12:34.607248237 +0000 UTC m=+1541.056030567" Apr 22 19:12:38.698404 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:38.698356 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:12:38.700018 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:38.699989 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="88ff248d-fc64-4559-8eb3-16b93beac2cc" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused" Apr 22 19:12:48.698185 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:48.698143 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="88ff248d-fc64-4559-8eb3-16b93beac2cc" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused" Apr 22 19:12:51.203026 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:51.202987 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz"] Apr 22 19:12:51.207314 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:51.207287 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz" Apr 22 19:12:51.209450 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:51.209420 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 22 19:12:51.209606 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:51.209421 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-epp-sa-dockercfg-cgt62\"" Apr 22 19:12:51.231073 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:51.230999 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz"] Apr 22 19:12:51.279512 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:51.279446 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/405b5964-5c71-4f27-924e-d10a4c1ac5f5-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz\" (UID: \"405b5964-5c71-4f27-924e-d10a4c1ac5f5\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz" Apr 22 19:12:51.279694 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:51.279607 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/405b5964-5c71-4f27-924e-d10a4c1ac5f5-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz\" (UID: \"405b5964-5c71-4f27-924e-d10a4c1ac5f5\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz" Apr 22 19:12:51.279694 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:51.279647 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plj9t\" (UniqueName: \"kubernetes.io/projected/405b5964-5c71-4f27-924e-d10a4c1ac5f5-kube-api-access-plj9t\") pod \"scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz\" (UID: \"405b5964-5c71-4f27-924e-d10a4c1ac5f5\") " 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz" Apr 22 19:12:51.279694 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:51.279668 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/405b5964-5c71-4f27-924e-d10a4c1ac5f5-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz\" (UID: \"405b5964-5c71-4f27-924e-d10a4c1ac5f5\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz" Apr 22 19:12:51.279818 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:51.279743 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/405b5964-5c71-4f27-924e-d10a4c1ac5f5-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz\" (UID: \"405b5964-5c71-4f27-924e-d10a4c1ac5f5\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz" Apr 22 19:12:51.279818 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:51.279771 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/405b5964-5c71-4f27-924e-d10a4c1ac5f5-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz\" (UID: \"405b5964-5c71-4f27-924e-d10a4c1ac5f5\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz" Apr 22 19:12:51.380932 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:51.380887 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/405b5964-5c71-4f27-924e-d10a4c1ac5f5-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz\" (UID: \"405b5964-5c71-4f27-924e-d10a4c1ac5f5\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz" Apr 22 19:12:51.381138 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:51.380946 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/405b5964-5c71-4f27-924e-d10a4c1ac5f5-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz\" (UID: \"405b5964-5c71-4f27-924e-d10a4c1ac5f5\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz" Apr 22 19:12:51.381138 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:51.380987 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/405b5964-5c71-4f27-924e-d10a4c1ac5f5-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz\" (UID: \"405b5964-5c71-4f27-924e-d10a4c1ac5f5\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz" Apr 22 19:12:51.381138 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:51.381094 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/405b5964-5c71-4f27-924e-d10a4c1ac5f5-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz\" (UID: \"405b5964-5c71-4f27-924e-d10a4c1ac5f5\") " 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz" Apr 22 19:12:51.381138 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:51.381120 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-plj9t\" (UniqueName: \"kubernetes.io/projected/405b5964-5c71-4f27-924e-d10a4c1ac5f5-kube-api-access-plj9t\") pod \"scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz\" (UID: \"405b5964-5c71-4f27-924e-d10a4c1ac5f5\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz" Apr 22 19:12:51.381325 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:51.381151 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/405b5964-5c71-4f27-924e-d10a4c1ac5f5-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz\" (UID: \"405b5964-5c71-4f27-924e-d10a4c1ac5f5\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz" Apr 22 19:12:51.381499 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:51.381455 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/405b5964-5c71-4f27-924e-d10a4c1ac5f5-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz\" (UID: \"405b5964-5c71-4f27-924e-d10a4c1ac5f5\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz" Apr 22 19:12:51.381601 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:51.381568 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/405b5964-5c71-4f27-924e-d10a4c1ac5f5-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz\" (UID: \"405b5964-5c71-4f27-924e-d10a4c1ac5f5\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz" Apr 22 19:12:51.381660 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:51.381620 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/405b5964-5c71-4f27-924e-d10a4c1ac5f5-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz\" (UID: \"405b5964-5c71-4f27-924e-d10a4c1ac5f5\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz" Apr 22 19:12:51.381660 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:51.381650 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/405b5964-5c71-4f27-924e-d10a4c1ac5f5-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz\" (UID: \"405b5964-5c71-4f27-924e-d10a4c1ac5f5\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz" Apr 22 19:12:51.383677 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:51.383650 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/405b5964-5c71-4f27-924e-d10a4c1ac5f5-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz\" (UID: \"405b5964-5c71-4f27-924e-d10a4c1ac5f5\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz" Apr 22 19:12:51.388632 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:51.388604 
2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-plj9t\" (UniqueName: \"kubernetes.io/projected/405b5964-5c71-4f27-924e-d10a4c1ac5f5-kube-api-access-plj9t\") pod \"scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz\" (UID: \"405b5964-5c71-4f27-924e-d10a4c1ac5f5\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz" Apr 22 19:12:51.525613 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:51.525577 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz" Apr 22 19:12:51.663821 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:51.663795 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz"] Apr 22 19:12:51.666403 ip-10-0-136-53 kubenswrapper[2579]: W0422 19:12:51.666374 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod405b5964_5c71_4f27_924e_d10a4c1ac5f5.slice/crio-0218d17479335f47bda5ea9ed71763bac5d00d6183bbb38896064ab2f1eea610 WatchSource:0}: Error finding container 0218d17479335f47bda5ea9ed71763bac5d00d6183bbb38896064ab2f1eea610: Status 404 returned error can't find the container with id 0218d17479335f47bda5ea9ed71763bac5d00d6183bbb38896064ab2f1eea610 Apr 22 19:12:52.663751 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:52.663711 2579 generic.go:358] "Generic (PLEG): container finished" podID="405b5964-5c71-4f27-924e-d10a4c1ac5f5" containerID="115c0677339824f11370b63b81469c18ec4effadc1e2d992caec3f5872ae0022" exitCode=0 Apr 22 19:12:52.664228 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:52.663786 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz" event={"ID":"405b5964-5c71-4f27-924e-d10a4c1ac5f5","Type":"ContainerDied","Data":"115c0677339824f11370b63b81469c18ec4effadc1e2d992caec3f5872ae0022"} Apr 22 19:12:52.664228 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:52.663828 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz" event={"ID":"405b5964-5c71-4f27-924e-d10a4c1ac5f5","Type":"ContainerStarted","Data":"0218d17479335f47bda5ea9ed71763bac5d00d6183bbb38896064ab2f1eea610"} Apr 22 19:12:53.207243 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:53.207205 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw"] Apr 22 19:12:53.207724 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:53.207678 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" podUID="6b9a2218-fce7-4554-91da-13d2bedb87f5" containerName="main" containerID="cri-o://97675c43e1fb27c4a8e11deaefd1e4088838937a0247f379d094fddfe266c666" gracePeriod=30 Apr 22 19:12:53.669678 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:53.669638 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz" event={"ID":"405b5964-5c71-4f27-924e-d10a4c1ac5f5","Type":"ContainerStarted","Data":"073a94fd2bb8b49cdede78138ea78248a51d2b5d1d1fb14787ff20eef71fec0e"} Apr 22 19:12:53.669678 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:53.669680 2579 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz" event={"ID":"405b5964-5c71-4f27-924e-d10a4c1ac5f5","Type":"ContainerStarted","Data":"0f3796066d645f3d0a7e7f39a970ff3cb9b5a05822cf1f36a2d7bbe11277a78d"} Apr 22 19:12:53.670120 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:53.669745 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz" Apr 22 19:12:53.689037 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:53.688972 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz" podStartSLOduration=2.688953748 podStartE2EDuration="2.688953748s" podCreationTimestamp="2026-04-22 19:12:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:12:53.686696955 +0000 UTC m=+1560.135479286" watchObservedRunningTime="2026-04-22 19:12:53.688953748 +0000 UTC m=+1560.137736080" Apr 22 19:12:58.698227 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:58.698187 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:12:58.698762 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:12:58.698640 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="88ff248d-fc64-4559-8eb3-16b93beac2cc" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused" Apr 22 19:13:01.526134 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:01.526089 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz" Apr 22 19:13:01.526134 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:01.526144 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz" Apr 22 19:13:01.528950 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:01.528913 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz" Apr 22 19:13:01.703525 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:01.703471 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz" Apr 22 19:13:08.698861 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:08.698813 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="88ff248d-fc64-4559-8eb3-16b93beac2cc" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused" Apr 22 19:13:18.698561 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:18.698503 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="88ff248d-fc64-4559-8eb3-16b93beac2cc" containerName="main" probeResult="failure" output="Get 
\"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused" Apr 22 19:13:22.710860 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:22.710825 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz" Apr 22 19:13:23.207915 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.207861 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" podUID="6b9a2218-fce7-4554-91da-13d2bedb87f5" containerName="llm-d-routing-sidecar" containerID="cri-o://d8a048bc69b6711a41174fffce836b19708a838cc2e07a83099ee35fa11a4b00" gracePeriod=2 Apr 22 19:13:23.485507 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.485470 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw_6b9a2218-fce7-4554-91da-13d2bedb87f5/main/0.log" Apr 22 19:13:23.486149 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.486128 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" Apr 22 19:13:23.605944 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.605902 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2218-fce7-4554-91da-13d2bedb87f5-home\") pod \"6b9a2218-fce7-4554-91da-13d2bedb87f5\" (UID: \"6b9a2218-fce7-4554-91da-13d2bedb87f5\") " Apr 22 19:13:23.606145 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.605960 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2218-fce7-4554-91da-13d2bedb87f5-kserve-provision-location\") pod \"6b9a2218-fce7-4554-91da-13d2bedb87f5\" (UID: \"6b9a2218-fce7-4554-91da-13d2bedb87f5\") " Apr 22 19:13:23.606145 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.606002 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2218-fce7-4554-91da-13d2bedb87f5-model-cache\") pod \"6b9a2218-fce7-4554-91da-13d2bedb87f5\" (UID: \"6b9a2218-fce7-4554-91da-13d2bedb87f5\") " Apr 22 19:13:23.606145 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.606029 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2218-fce7-4554-91da-13d2bedb87f5-dshm\") pod \"6b9a2218-fce7-4554-91da-13d2bedb87f5\" (UID: \"6b9a2218-fce7-4554-91da-13d2bedb87f5\") " Apr 22 19:13:23.606145 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.606048 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gbkw\" (UniqueName: \"kubernetes.io/projected/6b9a2218-fce7-4554-91da-13d2bedb87f5-kube-api-access-8gbkw\") pod \"6b9a2218-fce7-4554-91da-13d2bedb87f5\" (UID: \"6b9a2218-fce7-4554-91da-13d2bedb87f5\") " Apr 22 19:13:23.606145 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.606104 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6b9a2218-fce7-4554-91da-13d2bedb87f5-tls-certs\") pod \"6b9a2218-fce7-4554-91da-13d2bedb87f5\" (UID: \"6b9a2218-fce7-4554-91da-13d2bedb87f5\") " Apr 22 19:13:23.606448 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.606291 
2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b9a2218-fce7-4554-91da-13d2bedb87f5-model-cache" (OuterVolumeSpecName: "model-cache") pod "6b9a2218-fce7-4554-91da-13d2bedb87f5" (UID: "6b9a2218-fce7-4554-91da-13d2bedb87f5"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:13:23.606448 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.606302 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b9a2218-fce7-4554-91da-13d2bedb87f5-home" (OuterVolumeSpecName: "home") pod "6b9a2218-fce7-4554-91da-13d2bedb87f5" (UID: "6b9a2218-fce7-4554-91da-13d2bedb87f5"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:13:23.606614 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.606577 2579 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2218-fce7-4554-91da-13d2bedb87f5-model-cache\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:13:23.606674 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.606615 2579 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2218-fce7-4554-91da-13d2bedb87f5-home\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:13:23.608278 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.608250 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b9a2218-fce7-4554-91da-13d2bedb87f5-kube-api-access-8gbkw" (OuterVolumeSpecName: "kube-api-access-8gbkw") pod "6b9a2218-fce7-4554-91da-13d2bedb87f5" (UID: "6b9a2218-fce7-4554-91da-13d2bedb87f5"). InnerVolumeSpecName "kube-api-access-8gbkw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:13:23.608722 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.608697 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b9a2218-fce7-4554-91da-13d2bedb87f5-dshm" (OuterVolumeSpecName: "dshm") pod "6b9a2218-fce7-4554-91da-13d2bedb87f5" (UID: "6b9a2218-fce7-4554-91da-13d2bedb87f5"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:13:23.608722 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.608711 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b9a2218-fce7-4554-91da-13d2bedb87f5-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6b9a2218-fce7-4554-91da-13d2bedb87f5" (UID: "6b9a2218-fce7-4554-91da-13d2bedb87f5"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:13:23.668594 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.668539 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b9a2218-fce7-4554-91da-13d2bedb87f5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6b9a2218-fce7-4554-91da-13d2bedb87f5" (UID: "6b9a2218-fce7-4554-91da-13d2bedb87f5"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:13:23.708007 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.707964 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6b9a2218-fce7-4554-91da-13d2bedb87f5-tls-certs\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:13:23.708007 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.707999 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2218-fce7-4554-91da-13d2bedb87f5-kserve-provision-location\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:13:23.708007 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.708011 2579 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2218-fce7-4554-91da-13d2bedb87f5-dshm\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:13:23.708246 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.708024 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8gbkw\" (UniqueName: \"kubernetes.io/projected/6b9a2218-fce7-4554-91da-13d2bedb87f5-kube-api-access-8gbkw\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\"" Apr 22 19:13:23.805635 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.805608 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw_6b9a2218-fce7-4554-91da-13d2bedb87f5/main/0.log" Apr 22 19:13:23.807155 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.807117 2579 generic.go:358] "Generic (PLEG): container finished" podID="6b9a2218-fce7-4554-91da-13d2bedb87f5" containerID="97675c43e1fb27c4a8e11deaefd1e4088838937a0247f379d094fddfe266c666" exitCode=137 Apr 22 19:13:23.807343 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.807317 2579 generic.go:358] "Generic (PLEG): container finished" podID="6b9a2218-fce7-4554-91da-13d2bedb87f5" containerID="d8a048bc69b6711a41174fffce836b19708a838cc2e07a83099ee35fa11a4b00" exitCode=0 Apr 22 19:13:23.807470 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.807267 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" Apr 22 19:13:23.807470 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.807254 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" event={"ID":"6b9a2218-fce7-4554-91da-13d2bedb87f5","Type":"ContainerDied","Data":"97675c43e1fb27c4a8e11deaefd1e4088838937a0247f379d094fddfe266c666"} Apr 22 19:13:23.807627 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.807505 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" event={"ID":"6b9a2218-fce7-4554-91da-13d2bedb87f5","Type":"ContainerDied","Data":"d8a048bc69b6711a41174fffce836b19708a838cc2e07a83099ee35fa11a4b00"} Apr 22 19:13:23.807627 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.807532 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw" event={"ID":"6b9a2218-fce7-4554-91da-13d2bedb87f5","Type":"ContainerDied","Data":"e4174be2348051df121ca90d95d5d5789379fee120b251d06a9b46ea80eabfae"} Apr 22 19:13:23.807627 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.807514 2579 scope.go:117] "RemoveContainer" containerID="97675c43e1fb27c4a8e11deaefd1e4088838937a0247f379d094fddfe266c666" Apr 22 19:13:23.827104 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.827080 2579 scope.go:117] "RemoveContainer" containerID="e9859028c4eda90abfd2ce01fb536f0bb45d28f7523187feb41f9f26afa85876" Apr 22 19:13:23.832302 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.832266 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw"] Apr 22 19:13:23.836001 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.835971 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79cbc64cf9-d8hfw"] Apr 22 19:13:23.876734 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.876712 2579 scope.go:117] "RemoveContainer" containerID="d8a048bc69b6711a41174fffce836b19708a838cc2e07a83099ee35fa11a4b00" Apr 22 19:13:23.885455 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.885430 2579 scope.go:117] "RemoveContainer" containerID="97675c43e1fb27c4a8e11deaefd1e4088838937a0247f379d094fddfe266c666" Apr 22 19:13:23.885796 ip-10-0-136-53 kubenswrapper[2579]: E0422 19:13:23.885774 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97675c43e1fb27c4a8e11deaefd1e4088838937a0247f379d094fddfe266c666\": container with ID starting with 97675c43e1fb27c4a8e11deaefd1e4088838937a0247f379d094fddfe266c666 not found: ID does not exist" containerID="97675c43e1fb27c4a8e11deaefd1e4088838937a0247f379d094fddfe266c666" Apr 22 19:13:23.885859 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.885804 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97675c43e1fb27c4a8e11deaefd1e4088838937a0247f379d094fddfe266c666"} err="failed to get container status \"97675c43e1fb27c4a8e11deaefd1e4088838937a0247f379d094fddfe266c666\": rpc error: code = NotFound desc = could not find container \"97675c43e1fb27c4a8e11deaefd1e4088838937a0247f379d094fddfe266c666\": container with ID starting with 97675c43e1fb27c4a8e11deaefd1e4088838937a0247f379d094fddfe266c666 not found: ID does not exist" Apr 22 19:13:23.885859 ip-10-0-136-53 
kubenswrapper[2579]: I0422 19:13:23.885827 2579 scope.go:117] "RemoveContainer" containerID="e9859028c4eda90abfd2ce01fb536f0bb45d28f7523187feb41f9f26afa85876" Apr 22 19:13:23.886063 ip-10-0-136-53 kubenswrapper[2579]: E0422 19:13:23.886049 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9859028c4eda90abfd2ce01fb536f0bb45d28f7523187feb41f9f26afa85876\": container with ID starting with e9859028c4eda90abfd2ce01fb536f0bb45d28f7523187feb41f9f26afa85876 not found: ID does not exist" containerID="e9859028c4eda90abfd2ce01fb536f0bb45d28f7523187feb41f9f26afa85876" Apr 22 19:13:23.886107 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.886066 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9859028c4eda90abfd2ce01fb536f0bb45d28f7523187feb41f9f26afa85876"} err="failed to get container status \"e9859028c4eda90abfd2ce01fb536f0bb45d28f7523187feb41f9f26afa85876\": rpc error: code = NotFound desc = could not find container \"e9859028c4eda90abfd2ce01fb536f0bb45d28f7523187feb41f9f26afa85876\": container with ID starting with e9859028c4eda90abfd2ce01fb536f0bb45d28f7523187feb41f9f26afa85876 not found: ID does not exist" Apr 22 19:13:23.886107 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.886078 2579 scope.go:117] "RemoveContainer" containerID="d8a048bc69b6711a41174fffce836b19708a838cc2e07a83099ee35fa11a4b00" Apr 22 19:13:23.886359 ip-10-0-136-53 kubenswrapper[2579]: E0422 19:13:23.886339 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8a048bc69b6711a41174fffce836b19708a838cc2e07a83099ee35fa11a4b00\": container with ID starting with d8a048bc69b6711a41174fffce836b19708a838cc2e07a83099ee35fa11a4b00 not found: ID does not exist" containerID="d8a048bc69b6711a41174fffce836b19708a838cc2e07a83099ee35fa11a4b00" Apr 22 19:13:23.886415 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.886365 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8a048bc69b6711a41174fffce836b19708a838cc2e07a83099ee35fa11a4b00"} err="failed to get container status \"d8a048bc69b6711a41174fffce836b19708a838cc2e07a83099ee35fa11a4b00\": rpc error: code = NotFound desc = could not find container \"d8a048bc69b6711a41174fffce836b19708a838cc2e07a83099ee35fa11a4b00\": container with ID starting with d8a048bc69b6711a41174fffce836b19708a838cc2e07a83099ee35fa11a4b00 not found: ID does not exist" Apr 22 19:13:23.886415 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.886382 2579 scope.go:117] "RemoveContainer" containerID="97675c43e1fb27c4a8e11deaefd1e4088838937a0247f379d094fddfe266c666" Apr 22 19:13:23.886637 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.886617 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97675c43e1fb27c4a8e11deaefd1e4088838937a0247f379d094fddfe266c666"} err="failed to get container status \"97675c43e1fb27c4a8e11deaefd1e4088838937a0247f379d094fddfe266c666\": rpc error: code = NotFound desc = could not find container \"97675c43e1fb27c4a8e11deaefd1e4088838937a0247f379d094fddfe266c666\": container with ID starting with 97675c43e1fb27c4a8e11deaefd1e4088838937a0247f379d094fddfe266c666 not found: ID does not exist" Apr 22 19:13:23.886698 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.886638 2579 scope.go:117] "RemoveContainer" containerID="e9859028c4eda90abfd2ce01fb536f0bb45d28f7523187feb41f9f26afa85876" Apr 
22 19:13:23.886900 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.886884 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9859028c4eda90abfd2ce01fb536f0bb45d28f7523187feb41f9f26afa85876"} err="failed to get container status \"e9859028c4eda90abfd2ce01fb536f0bb45d28f7523187feb41f9f26afa85876\": rpc error: code = NotFound desc = could not find container \"e9859028c4eda90abfd2ce01fb536f0bb45d28f7523187feb41f9f26afa85876\": container with ID starting with e9859028c4eda90abfd2ce01fb536f0bb45d28f7523187feb41f9f26afa85876 not found: ID does not exist"
Apr 22 19:13:23.886945 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.886901 2579 scope.go:117] "RemoveContainer" containerID="d8a048bc69b6711a41174fffce836b19708a838cc2e07a83099ee35fa11a4b00"
Apr 22 19:13:23.887112 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:23.887097 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8a048bc69b6711a41174fffce836b19708a838cc2e07a83099ee35fa11a4b00"} err="failed to get container status \"d8a048bc69b6711a41174fffce836b19708a838cc2e07a83099ee35fa11a4b00\": rpc error: code = NotFound desc = could not find container \"d8a048bc69b6711a41174fffce836b19708a838cc2e07a83099ee35fa11a4b00\": container with ID starting with d8a048bc69b6711a41174fffce836b19708a838cc2e07a83099ee35fa11a4b00 not found: ID does not exist"
Apr 22 19:13:24.010898 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:24.010866 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz"]
Apr 22 19:13:24.011313 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:24.011284 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz" podUID="405b5964-5c71-4f27-924e-d10a4c1ac5f5" containerName="main" containerID="cri-o://0f3796066d645f3d0a7e7f39a970ff3cb9b5a05822cf1f36a2d7bbe11277a78d" gracePeriod=30
Apr 22 19:13:24.011709 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:24.011295 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz" podUID="405b5964-5c71-4f27-924e-d10a4c1ac5f5" containerName="tokenizer" containerID="cri-o://073a94fd2bb8b49cdede78138ea78248a51d2b5d1d1fb14787ff20eef71fec0e" gracePeriod=30
Apr 22 19:13:24.164676 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:24.164635 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b9a2218-fce7-4554-91da-13d2bedb87f5" path="/var/lib/kubelet/pods/6b9a2218-fce7-4554-91da-13d2bedb87f5/volumes"
Apr 22 19:13:24.815266 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:24.815228 2579 generic.go:358] "Generic (PLEG): container finished" podID="405b5964-5c71-4f27-924e-d10a4c1ac5f5" containerID="0f3796066d645f3d0a7e7f39a970ff3cb9b5a05822cf1f36a2d7bbe11277a78d" exitCode=0
Apr 22 19:13:24.815685 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:24.815271 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz" event={"ID":"405b5964-5c71-4f27-924e-d10a4c1ac5f5","Type":"ContainerDied","Data":"0f3796066d645f3d0a7e7f39a970ff3cb9b5a05822cf1f36a2d7bbe11277a78d"}
Apr 22 19:13:25.471571 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:25.471545 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz"
Apr 22 19:13:25.526464 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:25.526356 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plj9t\" (UniqueName: \"kubernetes.io/projected/405b5964-5c71-4f27-924e-d10a4c1ac5f5-kube-api-access-plj9t\") pod \"405b5964-5c71-4f27-924e-d10a4c1ac5f5\" (UID: \"405b5964-5c71-4f27-924e-d10a4c1ac5f5\") "
Apr 22 19:13:25.526464 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:25.526410 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/405b5964-5c71-4f27-924e-d10a4c1ac5f5-tokenizer-uds\") pod \"405b5964-5c71-4f27-924e-d10a4c1ac5f5\" (UID: \"405b5964-5c71-4f27-924e-d10a4c1ac5f5\") "
Apr 22 19:13:25.526464 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:25.526444 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/405b5964-5c71-4f27-924e-d10a4c1ac5f5-tokenizer-cache\") pod \"405b5964-5c71-4f27-924e-d10a4c1ac5f5\" (UID: \"405b5964-5c71-4f27-924e-d10a4c1ac5f5\") "
Apr 22 19:13:25.526780 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:25.526526 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/405b5964-5c71-4f27-924e-d10a4c1ac5f5-kserve-provision-location\") pod \"405b5964-5c71-4f27-924e-d10a4c1ac5f5\" (UID: \"405b5964-5c71-4f27-924e-d10a4c1ac5f5\") "
Apr 22 19:13:25.526780 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:25.526675 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/405b5964-5c71-4f27-924e-d10a4c1ac5f5-tokenizer-tmp\") pod \"405b5964-5c71-4f27-924e-d10a4c1ac5f5\" (UID: \"405b5964-5c71-4f27-924e-d10a4c1ac5f5\") "
Apr 22 19:13:25.526780 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:25.526721 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/405b5964-5c71-4f27-924e-d10a4c1ac5f5-tls-certs\") pod \"405b5964-5c71-4f27-924e-d10a4c1ac5f5\" (UID: \"405b5964-5c71-4f27-924e-d10a4c1ac5f5\") "
Apr 22 19:13:25.526780 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:25.526745 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/405b5964-5c71-4f27-924e-d10a4c1ac5f5-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "405b5964-5c71-4f27-924e-d10a4c1ac5f5" (UID: "405b5964-5c71-4f27-924e-d10a4c1ac5f5"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:13:25.527063 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:25.527040 2579 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/405b5964-5c71-4f27-924e-d10a4c1ac5f5-tokenizer-cache\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\""
Apr 22 19:13:25.528652 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:25.528538 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/405b5964-5c71-4f27-924e-d10a4c1ac5f5-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "405b5964-5c71-4f27-924e-d10a4c1ac5f5" (UID: "405b5964-5c71-4f27-924e-d10a4c1ac5f5"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:13:25.528771 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:25.528711 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/405b5964-5c71-4f27-924e-d10a4c1ac5f5-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "405b5964-5c71-4f27-924e-d10a4c1ac5f5" (UID: "405b5964-5c71-4f27-924e-d10a4c1ac5f5"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:13:25.528771 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:25.528725 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/405b5964-5c71-4f27-924e-d10a4c1ac5f5-kube-api-access-plj9t" (OuterVolumeSpecName: "kube-api-access-plj9t") pod "405b5964-5c71-4f27-924e-d10a4c1ac5f5" (UID: "405b5964-5c71-4f27-924e-d10a4c1ac5f5"). InnerVolumeSpecName "kube-api-access-plj9t". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:13:25.528851 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:25.528800 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/405b5964-5c71-4f27-924e-d10a4c1ac5f5-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "405b5964-5c71-4f27-924e-d10a4c1ac5f5" (UID: "405b5964-5c71-4f27-924e-d10a4c1ac5f5"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:13:25.529052 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:25.529030 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/405b5964-5c71-4f27-924e-d10a4c1ac5f5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "405b5964-5c71-4f27-924e-d10a4c1ac5f5" (UID: "405b5964-5c71-4f27-924e-d10a4c1ac5f5"). InnerVolumeSpecName "kserve-provision-location".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:13:25.627846 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:25.627809 2579 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/405b5964-5c71-4f27-924e-d10a4c1ac5f5-tokenizer-tmp\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\""
Apr 22 19:13:25.627846 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:25.627842 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/405b5964-5c71-4f27-924e-d10a4c1ac5f5-tls-certs\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\""
Apr 22 19:13:25.627846 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:25.627854 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-plj9t\" (UniqueName: \"kubernetes.io/projected/405b5964-5c71-4f27-924e-d10a4c1ac5f5-kube-api-access-plj9t\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\""
Apr 22 19:13:25.628095 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:25.627864 2579 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/405b5964-5c71-4f27-924e-d10a4c1ac5f5-tokenizer-uds\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\""
Apr 22 19:13:25.628095 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:25.627877 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/405b5964-5c71-4f27-924e-d10a4c1ac5f5-kserve-provision-location\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\""
Apr 22 19:13:25.820824 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:25.820732 2579 generic.go:358] "Generic (PLEG): container finished" podID="405b5964-5c71-4f27-924e-d10a4c1ac5f5" containerID="073a94fd2bb8b49cdede78138ea78248a51d2b5d1d1fb14787ff20eef71fec0e" exitCode=0
Apr 22 19:13:25.820824 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:25.820785 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz" event={"ID":"405b5964-5c71-4f27-924e-d10a4c1ac5f5","Type":"ContainerDied","Data":"073a94fd2bb8b49cdede78138ea78248a51d2b5d1d1fb14787ff20eef71fec0e"}
Apr 22 19:13:25.820824 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:25.820819 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz" event={"ID":"405b5964-5c71-4f27-924e-d10a4c1ac5f5","Type":"ContainerDied","Data":"0218d17479335f47bda5ea9ed71763bac5d00d6183bbb38896064ab2f1eea610"}
Apr 22 19:13:25.821321 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:25.820842 2579 scope.go:117] "RemoveContainer" containerID="073a94fd2bb8b49cdede78138ea78248a51d2b5d1d1fb14787ff20eef71fec0e"
Apr 22 19:13:25.821321 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:25.820858 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz"
Apr 22 19:13:25.830817 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:25.830796 2579 scope.go:117] "RemoveContainer" containerID="0f3796066d645f3d0a7e7f39a970ff3cb9b5a05822cf1f36a2d7bbe11277a78d"
Apr 22 19:13:25.840052 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:25.840025 2579 scope.go:117] "RemoveContainer" containerID="115c0677339824f11370b63b81469c18ec4effadc1e2d992caec3f5872ae0022"
Apr 22 19:13:25.842941 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:25.842913 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz"]
Apr 22 19:13:25.847626 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:25.847598 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-56d9ds4wsz"]
Apr 22 19:13:25.849650 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:25.849629 2579 scope.go:117] "RemoveContainer" containerID="073a94fd2bb8b49cdede78138ea78248a51d2b5d1d1fb14787ff20eef71fec0e"
Apr 22 19:13:25.849946 ip-10-0-136-53 kubenswrapper[2579]: E0422 19:13:25.849928 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"073a94fd2bb8b49cdede78138ea78248a51d2b5d1d1fb14787ff20eef71fec0e\": container with ID starting with 073a94fd2bb8b49cdede78138ea78248a51d2b5d1d1fb14787ff20eef71fec0e not found: ID does not exist" containerID="073a94fd2bb8b49cdede78138ea78248a51d2b5d1d1fb14787ff20eef71fec0e"
Apr 22 19:13:25.850004 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:25.849956 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"073a94fd2bb8b49cdede78138ea78248a51d2b5d1d1fb14787ff20eef71fec0e"} err="failed to get container status \"073a94fd2bb8b49cdede78138ea78248a51d2b5d1d1fb14787ff20eef71fec0e\": rpc error: code = NotFound desc = could not find container \"073a94fd2bb8b49cdede78138ea78248a51d2b5d1d1fb14787ff20eef71fec0e\": container with ID starting with 073a94fd2bb8b49cdede78138ea78248a51d2b5d1d1fb14787ff20eef71fec0e not found: ID does not exist"
Apr 22 19:13:25.850004 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:25.849976 2579 scope.go:117] "RemoveContainer" containerID="0f3796066d645f3d0a7e7f39a970ff3cb9b5a05822cf1f36a2d7bbe11277a78d"
Apr 22 19:13:25.850185 ip-10-0-136-53 kubenswrapper[2579]: E0422 19:13:25.850168 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f3796066d645f3d0a7e7f39a970ff3cb9b5a05822cf1f36a2d7bbe11277a78d\": container with ID starting with 0f3796066d645f3d0a7e7f39a970ff3cb9b5a05822cf1f36a2d7bbe11277a78d not found: ID does not exist" containerID="0f3796066d645f3d0a7e7f39a970ff3cb9b5a05822cf1f36a2d7bbe11277a78d"
Apr 22 19:13:25.850227 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:25.850188 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f3796066d645f3d0a7e7f39a970ff3cb9b5a05822cf1f36a2d7bbe11277a78d"} err="failed to get container status \"0f3796066d645f3d0a7e7f39a970ff3cb9b5a05822cf1f36a2d7bbe11277a78d\": rpc error: code = NotFound desc = could not find container \"0f3796066d645f3d0a7e7f39a970ff3cb9b5a05822cf1f36a2d7bbe11277a78d\": container with ID starting with 0f3796066d645f3d0a7e7f39a970ff3cb9b5a05822cf1f36a2d7bbe11277a78d not found: ID does not exist"
Apr 22 19:13:25.850227 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:25.850205 2579 scope.go:117] "RemoveContainer" containerID="115c0677339824f11370b63b81469c18ec4effadc1e2d992caec3f5872ae0022"
Apr 22 19:13:25.850378 ip-10-0-136-53 kubenswrapper[2579]: E0422 19:13:25.850362 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"115c0677339824f11370b63b81469c18ec4effadc1e2d992caec3f5872ae0022\": container with ID starting with 115c0677339824f11370b63b81469c18ec4effadc1e2d992caec3f5872ae0022 not found: ID does not exist" containerID="115c0677339824f11370b63b81469c18ec4effadc1e2d992caec3f5872ae0022"
Apr 22 19:13:25.850420 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:25.850381 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"115c0677339824f11370b63b81469c18ec4effadc1e2d992caec3f5872ae0022"} err="failed to get container status \"115c0677339824f11370b63b81469c18ec4effadc1e2d992caec3f5872ae0022\": rpc error: code = NotFound desc = could not find container \"115c0677339824f11370b63b81469c18ec4effadc1e2d992caec3f5872ae0022\": container with ID starting with 115c0677339824f11370b63b81469c18ec4effadc1e2d992caec3f5872ae0022 not found: ID does not exist"
Apr 22 19:13:26.166044 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:26.165966 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="405b5964-5c71-4f27-924e-d10a4c1ac5f5" path="/var/lib/kubelet/pods/405b5964-5c71-4f27-924e-d10a4c1ac5f5/volumes"
Apr 22 19:13:28.698993 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:28.698946 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="88ff248d-fc64-4559-8eb3-16b93beac2cc" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused"
Apr 22 19:13:38.698280 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:38.698231 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="88ff248d-fc64-4559-8eb3-16b93beac2cc" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused"
Apr 22 19:13:48.698691 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:48.698643 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="88ff248d-fc64-4559-8eb3-16b93beac2cc" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused"
Apr 22 19:13:58.698953 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:13:58.698903 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="88ff248d-fc64-4559-8eb3-16b93beac2cc" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused"
Apr 22 19:14:08.698667 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:08.698625 2579 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="88ff248d-fc64-4559-8eb3-16b93beac2cc" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp
10.134.0.47:8000: connect: connection refused"
Apr 22 19:14:18.708405 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:18.708373 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 19:14:18.716127 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:18.716097 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 19:14:26.307920 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:26.307871 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 22 19:14:26.308853 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:26.308795 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="88ff248d-fc64-4559-8eb3-16b93beac2cc" containerName="main" containerID="cri-o://dfc5e248e307af793db4ccb6fd552e35d59be912420919f1dce09d39bae6ef71" gracePeriod=30
Apr 22 19:14:27.056775 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:27.056741 2579 generic.go:358] "Generic (PLEG): container finished" podID="88ff248d-fc64-4559-8eb3-16b93beac2cc" containerID="dfc5e248e307af793db4ccb6fd552e35d59be912420919f1dce09d39bae6ef71" exitCode=0
Apr 22 19:14:27.056979 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:27.056805 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"88ff248d-fc64-4559-8eb3-16b93beac2cc","Type":"ContainerDied","Data":"dfc5e248e307af793db4ccb6fd552e35d59be912420919f1dce09d39bae6ef71"}
Apr 22 19:14:27.163400 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:27.163375 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 19:14:27.331694 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:27.331651 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/88ff248d-fc64-4559-8eb3-16b93beac2cc-tls-certs\") pod \"88ff248d-fc64-4559-8eb3-16b93beac2cc\" (UID: \"88ff248d-fc64-4559-8eb3-16b93beac2cc\") "
Apr 22 19:14:27.332186 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:27.331709 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/88ff248d-fc64-4559-8eb3-16b93beac2cc-dshm\") pod \"88ff248d-fc64-4559-8eb3-16b93beac2cc\" (UID: \"88ff248d-fc64-4559-8eb3-16b93beac2cc\") "
Apr 22 19:14:27.332186 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:27.331771 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24tt4\" (UniqueName: \"kubernetes.io/projected/88ff248d-fc64-4559-8eb3-16b93beac2cc-kube-api-access-24tt4\") pod \"88ff248d-fc64-4559-8eb3-16b93beac2cc\" (UID: \"88ff248d-fc64-4559-8eb3-16b93beac2cc\") "
Apr 22 19:14:27.332186 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:27.331795 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/88ff248d-fc64-4559-8eb3-16b93beac2cc-home\") pod \"88ff248d-fc64-4559-8eb3-16b93beac2cc\" (UID: \"88ff248d-fc64-4559-8eb3-16b93beac2cc\") "
Apr 22 19:14:27.332186 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:27.331853 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/88ff248d-fc64-4559-8eb3-16b93beac2cc-model-cache\") pod \"88ff248d-fc64-4559-8eb3-16b93beac2cc\" (UID: \"88ff248d-fc64-4559-8eb3-16b93beac2cc\") "
Apr 22 19:14:27.332186 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:27.331884 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88ff248d-fc64-4559-8eb3-16b93beac2cc-kserve-provision-location\") pod \"88ff248d-fc64-4559-8eb3-16b93beac2cc\" (UID: \"88ff248d-fc64-4559-8eb3-16b93beac2cc\") "
Apr 22 19:14:27.332186 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:27.332167 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88ff248d-fc64-4559-8eb3-16b93beac2cc-model-cache" (OuterVolumeSpecName: "model-cache") pod "88ff248d-fc64-4559-8eb3-16b93beac2cc" (UID: "88ff248d-fc64-4559-8eb3-16b93beac2cc"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:14:27.332574 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:27.332274 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88ff248d-fc64-4559-8eb3-16b93beac2cc-home" (OuterVolumeSpecName: "home") pod "88ff248d-fc64-4559-8eb3-16b93beac2cc" (UID: "88ff248d-fc64-4559-8eb3-16b93beac2cc"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:14:27.333964 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:27.333935 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88ff248d-fc64-4559-8eb3-16b93beac2cc-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "88ff248d-fc64-4559-8eb3-16b93beac2cc" (UID: "88ff248d-fc64-4559-8eb3-16b93beac2cc"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:14:27.334076 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:27.334061 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88ff248d-fc64-4559-8eb3-16b93beac2cc-dshm" (OuterVolumeSpecName: "dshm") pod "88ff248d-fc64-4559-8eb3-16b93beac2cc" (UID: "88ff248d-fc64-4559-8eb3-16b93beac2cc"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:14:27.334240 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:27.334210 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88ff248d-fc64-4559-8eb3-16b93beac2cc-kube-api-access-24tt4" (OuterVolumeSpecName: "kube-api-access-24tt4") pod "88ff248d-fc64-4559-8eb3-16b93beac2cc" (UID: "88ff248d-fc64-4559-8eb3-16b93beac2cc"). InnerVolumeSpecName "kube-api-access-24tt4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:14:27.387453 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:27.387401 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88ff248d-fc64-4559-8eb3-16b93beac2cc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "88ff248d-fc64-4559-8eb3-16b93beac2cc" (UID: "88ff248d-fc64-4559-8eb3-16b93beac2cc"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:14:27.432937 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:27.432896 2579 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/88ff248d-fc64-4559-8eb3-16b93beac2cc-dshm\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\""
Apr 22 19:14:27.432937 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:27.432931 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-24tt4\" (UniqueName: \"kubernetes.io/projected/88ff248d-fc64-4559-8eb3-16b93beac2cc-kube-api-access-24tt4\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\""
Apr 22 19:14:27.432937 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:27.432943 2579 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/88ff248d-fc64-4559-8eb3-16b93beac2cc-home\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\""
Apr 22 19:14:27.433206 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:27.432954 2579 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/88ff248d-fc64-4559-8eb3-16b93beac2cc-model-cache\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\""
Apr 22 19:14:27.433206 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:27.432964 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88ff248d-fc64-4559-8eb3-16b93beac2cc-kserve-provision-location\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\""
Apr 22 19:14:27.433206 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:27.432972 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/88ff248d-fc64-4559-8eb3-16b93beac2cc-tls-certs\") on node \"ip-10-0-136-53.ec2.internal\" DevicePath \"\""
Apr 22 19:14:28.062146 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:28.062107 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
event={"ID":"88ff248d-fc64-4559-8eb3-16b93beac2cc","Type":"ContainerDied","Data":"3d5cd37f2953e34dfb8392fe4ac889297596811b691571793069ef4bbd794531"}
Apr 22 19:14:28.062352 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:28.062160 2579 scope.go:117] "RemoveContainer" containerID="dfc5e248e307af793db4ccb6fd552e35d59be912420919f1dce09d39bae6ef71"
Apr 22 19:14:28.062352 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:28.062118 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 19:14:28.081553 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:28.081518 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 22 19:14:28.085317 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:28.085296 2579 scope.go:117] "RemoveContainer" containerID="d70ef613a734c0684200ab4ff6b14b87c62000b9d4e0ba2f5be0c00a79a957f9"
Apr 22 19:14:28.085947 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:28.085921 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 22 19:14:28.164960 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:28.164928 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88ff248d-fc64-4559-8eb3-16b93beac2cc" path="/var/lib/kubelet/pods/88ff248d-fc64-4559-8eb3-16b93beac2cc/volumes"
Apr 22 19:14:42.213925 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:42.213893 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-4rw4s_e2e10ab8-e572-41fb-a750-a15455e243af/istio-proxy/0.log"
Apr 22 19:14:42.237634 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:42.237604 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-9c8xx_76a20a8e-8dd7-4dae-9619-2506028350be/istio-proxy/0.log"
Apr 22 19:14:43.126064 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:43.126037 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-4rw4s_e2e10ab8-e572-41fb-a750-a15455e243af/istio-proxy/0.log"
Apr 22 19:14:43.159634 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:43.159595 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-9c8xx_76a20a8e-8dd7-4dae-9619-2506028350be/istio-proxy/0.log"
Apr 22 19:14:44.055754 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:44.055721 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-4rw4s_e2e10ab8-e572-41fb-a750-a15455e243af/istio-proxy/0.log"
Apr 22 19:14:44.070071 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:44.070045 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-9c8xx_76a20a8e-8dd7-4dae-9619-2506028350be/istio-proxy/0.log"
Apr 22 19:14:44.935691 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:44.935661 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-4rw4s_e2e10ab8-e572-41fb-a750-a15455e243af/istio-proxy/0.log"
Apr 22 19:14:44.949295 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:44.949270 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-9c8xx_76a20a8e-8dd7-4dae-9619-2506028350be/istio-proxy/0.log"
Apr 22 19:14:45.826754 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:45.826722 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-4rw4s_e2e10ab8-e572-41fb-a750-a15455e243af/istio-proxy/0.log"
Apr 22 19:14:45.844152 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:45.844113 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-9c8xx_76a20a8e-8dd7-4dae-9619-2506028350be/istio-proxy/0.log"
Apr 22 19:14:46.716276 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:46.716243 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-4rw4s_e2e10ab8-e572-41fb-a750-a15455e243af/istio-proxy/0.log"
Apr 22 19:14:46.730454 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:46.730424 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-9c8xx_76a20a8e-8dd7-4dae-9619-2506028350be/istio-proxy/0.log"
Apr 22 19:14:47.600117 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:47.600076 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-4rw4s_e2e10ab8-e572-41fb-a750-a15455e243af/istio-proxy/0.log"
Apr 22 19:14:47.613291 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:47.613260 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-9c8xx_76a20a8e-8dd7-4dae-9619-2506028350be/istio-proxy/0.log"
Apr 22 19:14:48.478911 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:48.478876 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-4rw4s_e2e10ab8-e572-41fb-a750-a15455e243af/istio-proxy/0.log"
Apr 22 19:14:48.494617 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:48.494583 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-9c8xx_76a20a8e-8dd7-4dae-9619-2506028350be/istio-proxy/0.log"
Apr 22 19:14:49.352109 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:49.352079 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-4rw4s_e2e10ab8-e572-41fb-a750-a15455e243af/istio-proxy/0.log"
Apr 22 19:14:49.370237 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:49.370206 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-9c8xx_76a20a8e-8dd7-4dae-9619-2506028350be/istio-proxy/0.log"
Apr 22 19:14:50.228993 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:50.228963 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-4rw4s_e2e10ab8-e572-41fb-a750-a15455e243af/istio-proxy/0.log"
Apr 22 19:14:50.246849 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:50.246825 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-9c8xx_76a20a8e-8dd7-4dae-9619-2506028350be/istio-proxy/0.log"
Apr 22 19:14:51.111507 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:51.111463 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-4rw4s_e2e10ab8-e572-41fb-a750-a15455e243af/istio-proxy/0.log"
Apr 22 19:14:51.124808 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:51.124784 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-9c8xx_76a20a8e-8dd7-4dae-9619-2506028350be/istio-proxy/0.log"
Apr 22 19:14:51.984757 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:51.984730 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-4rw4s_e2e10ab8-e572-41fb-a750-a15455e243af/istio-proxy/0.log"
Apr 22 19:14:52.001963 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:52.001933 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-9c8xx_76a20a8e-8dd7-4dae-9619-2506028350be/istio-proxy/0.log"
Apr 22 19:14:52.909498 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:52.909456 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-4rw4s_e2e10ab8-e572-41fb-a750-a15455e243af/istio-proxy/0.log"
Apr 22 19:14:52.924035 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:52.924004 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-9c8xx_76a20a8e-8dd7-4dae-9619-2506028350be/istio-proxy/0.log"
Apr 22 19:14:53.981039 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:53.981003 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-4rw4s_e2e10ab8-e572-41fb-a750-a15455e243af/istio-proxy/0.log"
Apr 22 19:14:53.995439 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:53.995409 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-9c8xx_76a20a8e-8dd7-4dae-9619-2506028350be/istio-proxy/0.log"
Apr 22 19:14:54.939300 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:54.939263 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-7v85r_32410f0c-444a-4b71-837b-09e27a6b6bb9/istio-proxy/0.log"
Apr 22 19:14:54.953366 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:54.953340 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-66ccb77bfb-zczmp_7033a011-280c-43ac-a654-1bfbeeac2a87/router/0.log"
Apr 22 19:14:55.726327 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:55.726292 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-7v85r_32410f0c-444a-4b71-837b-09e27a6b6bb9/istio-proxy/0.log"
Apr 22 19:14:55.741194 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:55.741158 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-66ccb77bfb-zczmp_7033a011-280c-43ac-a654-1bfbeeac2a87/router/0.log"
Apr 22 19:14:56.477377 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:56.477336 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-9txrc_8ab037c0-ba08-477b-9b37-f4d50d3e7104/authorino/0.log"
Apr 22 19:14:56.502829 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:14:56.502795 2579 log.go:25] "Finished parsing log file"
path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-cwlxp_7962152c-6d85-4334-acaf-14c9a21560ce/manager/0.log"
Apr 22 19:15:01.681762 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:01.681730 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-c9qkp_fe6dcc87-ee88-4a3d-a648-de6bce8431df/global-pull-secret-syncer/0.log"
Apr 22 19:15:01.765361 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:01.765331 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-j98vd_4ac72934-9084-4fe8-82a7-ad61ce635cb5/konnectivity-agent/0.log"
Apr 22 19:15:01.885749 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:01.885718 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-136-53.ec2.internal_837597ad9723f3b9499e99b045f4e6a9/haproxy/0.log"
Apr 22 19:15:06.077395 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:06.077362 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-9txrc_8ab037c0-ba08-477b-9b37-f4d50d3e7104/authorino/0.log"
Apr 22 19:15:06.127798 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:06.127712 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-cwlxp_7962152c-6d85-4334-acaf-14c9a21560ce/manager/0.log"
Apr 22 19:15:07.434248 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:07.434166 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-84d4bd4d44-fbqmd_2c00d446-bbf6-4aab-a27c-ed0836ea1765/metrics-server/0.log"
Apr 22 19:15:07.489430 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:07.489404 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-776x8_1e48e07e-79a3-4925-a19a-b296810004a0/node-exporter/0.log"
Apr 22 19:15:07.514414 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:07.514388 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-776x8_1e48e07e-79a3-4925-a19a-b296810004a0/kube-rbac-proxy/0.log"
Apr 22 19:15:07.535282 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:07.535255 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-776x8_1e48e07e-79a3-4925-a19a-b296810004a0/init-textfile/0.log"
Apr 22 19:15:07.854681 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:07.854656 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c31833ba-2851-45cc-9133-315b38587337/prometheus/0.log"
Apr 22 19:15:07.881917 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:07.881894 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c31833ba-2851-45cc-9133-315b38587337/config-reloader/0.log"
Apr 22 19:15:07.908334 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:07.908281 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c31833ba-2851-45cc-9133-315b38587337/thanos-sidecar/0.log"
Apr 22 19:15:07.932361 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:07.932334 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c31833ba-2851-45cc-9133-315b38587337/kube-rbac-proxy-web/0.log"
Apr 22 19:15:07.966152 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:07.966122 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c31833ba-2851-45cc-9133-315b38587337/kube-rbac-proxy/0.log"
Apr 22 19:15:07.989010 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:07.988981 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c31833ba-2851-45cc-9133-315b38587337/kube-rbac-proxy-thanos/0.log"
Apr 22 19:15:08.012886 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:08.012862 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c31833ba-2851-45cc-9133-315b38587337/init-config-reloader/0.log"
Apr 22 19:15:08.120202 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:08.120114 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-854f54fd88-552xv_8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494/telemeter-client/0.log"
Apr 22 19:15:08.139885 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:08.139861 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-854f54fd88-552xv_8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494/reload/0.log"
Apr 22 19:15:08.159091 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:08.159067 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-854f54fd88-552xv_8b0f7b21-b1b4-4d29-a1f1-54be7d7fa494/kube-rbac-proxy/0.log"
Apr 22 19:15:09.418295 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:09.418259 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-dr26w_cc5b6d2f-45de-45a1-9bb3-d90f5940d33b/networking-console-plugin/0.log"
Apr 22 19:15:10.363442 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.363411 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-s8zf7_8ce5885b-3d73-4c7a-9648-01b63671ceaa/download-server/0.log"
Apr 22 19:15:10.638967 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.638891 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-sn5zn/perf-node-gather-daemonset-7zwlj"]
Apr 22 19:15:10.639313 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.639237 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b9a2218-fce7-4554-91da-13d2bedb87f5" containerName="llm-d-routing-sidecar"
Apr 22 19:15:10.639313 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.639248 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b9a2218-fce7-4554-91da-13d2bedb87f5" containerName="llm-d-routing-sidecar"
Apr 22 19:15:10.639313 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.639257 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88ff248d-fc64-4559-8eb3-16b93beac2cc" containerName="main"
Apr 22 19:15:10.639313 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.639263 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="88ff248d-fc64-4559-8eb3-16b93beac2cc" containerName="main"
Apr 22 19:15:10.639313 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.639272 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88ff248d-fc64-4559-8eb3-16b93beac2cc" containerName="storage-initializer"
Apr 22 19:15:10.639313 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.639278 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="88ff248d-fc64-4559-8eb3-16b93beac2cc" containerName="storage-initializer"
Apr 22 19:15:10.639313 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.639285 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="405b5964-5c71-4f27-924e-d10a4c1ac5f5" containerName="tokenizer"
Apr 22 19:15:10.639313 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.639291 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="405b5964-5c71-4f27-924e-d10a4c1ac5f5" containerName="tokenizer"
Apr 22 19:15:10.639313 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.639302 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="405b5964-5c71-4f27-924e-d10a4c1ac5f5" containerName="main"
Apr 22 19:15:10.639313 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.639309 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="405b5964-5c71-4f27-924e-d10a4c1ac5f5" containerName="main"
Apr 22 19:15:10.639313 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.639316 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="405b5964-5c71-4f27-924e-d10a4c1ac5f5" containerName="storage-initializer"
Apr 22 19:15:10.639680 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.639322 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="405b5964-5c71-4f27-924e-d10a4c1ac5f5" containerName="storage-initializer"
Apr 22 19:15:10.639680 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.639331 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b9a2218-fce7-4554-91da-13d2bedb87f5" containerName="main"
Apr 22 19:15:10.639680 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.639336 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b9a2218-fce7-4554-91da-13d2bedb87f5" containerName="main"
Apr 22 19:15:10.639680 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.639344 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b9a2218-fce7-4554-91da-13d2bedb87f5" containerName="storage-initializer"
Apr 22 19:15:10.639680 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.639350 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b9a2218-fce7-4554-91da-13d2bedb87f5" containerName="storage-initializer"
Apr 22 19:15:10.639680 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.639407 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b9a2218-fce7-4554-91da-13d2bedb87f5" containerName="llm-d-routing-sidecar"
Apr 22 19:15:10.639680 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.639415 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b9a2218-fce7-4554-91da-13d2bedb87f5" containerName="main"
Apr 22 19:15:10.639680 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.639424 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="88ff248d-fc64-4559-8eb3-16b93beac2cc" containerName="main"
Apr 22 19:15:10.639680 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.639431 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="405b5964-5c71-4f27-924e-d10a4c1ac5f5" containerName="tokenizer"
Apr 22 19:15:10.639680 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.639438 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="405b5964-5c71-4f27-924e-d10a4c1ac5f5" containerName="main"
Apr 22 19:15:10.643724 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.643633 2579 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-7zwlj"
Apr 22 19:15:10.645896 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.645874 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-sn5zn\"/\"kube-root-ca.crt\""
Apr 22 19:15:10.646014 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.645882 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-sn5zn\"/\"default-dockercfg-dc758\""
Apr 22 19:15:10.646703 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.646673 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-sn5zn\"/\"openshift-service-ca.crt\""
Apr 22 19:15:10.649283 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.649262 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-sn5zn/perf-node-gather-daemonset-7zwlj"]
Apr 22 19:15:10.740261 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.740215 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8913a376-e09e-471c-85e5-b241f4333f95-lib-modules\") pod \"perf-node-gather-daemonset-7zwlj\" (UID: \"8913a376-e09e-471c-85e5-b241f4333f95\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-7zwlj"
Apr 22 19:15:10.740453 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.740276 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8913a376-e09e-471c-85e5-b241f4333f95-sys\") pod \"perf-node-gather-daemonset-7zwlj\" (UID: \"8913a376-e09e-471c-85e5-b241f4333f95\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-7zwlj"
Apr 22 19:15:10.740453 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.740320 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8913a376-e09e-471c-85e5-b241f4333f95-podres\") pod \"perf-node-gather-daemonset-7zwlj\" (UID: \"8913a376-e09e-471c-85e5-b241f4333f95\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-7zwlj"
Apr 22 19:15:10.740453 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.740423 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8913a376-e09e-471c-85e5-b241f4333f95-proc\") pod \"perf-node-gather-daemonset-7zwlj\" (UID: \"8913a376-e09e-471c-85e5-b241f4333f95\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-7zwlj"
Apr 22 19:15:10.740453 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.740444 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpfh8\" (UniqueName: \"kubernetes.io/projected/8913a376-e09e-471c-85e5-b241f4333f95-kube-api-access-tpfh8\") pod \"perf-node-gather-daemonset-7zwlj\" (UID: \"8913a376-e09e-471c-85e5-b241f4333f95\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-7zwlj"
Apr 22 19:15:10.832542 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.832510 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-k4nph_3ea60cc1-e044-48db-8411-8f2ee6f65e26/volume-data-source-validator/0.log"
Apr 22 19:15:10.841207 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.841172 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8913a376-e09e-471c-85e5-b241f4333f95-podres\") pod \"perf-node-gather-daemonset-7zwlj\" (UID: \"8913a376-e09e-471c-85e5-b241f4333f95\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-7zwlj"
Apr 22 19:15:10.841354 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.841251 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8913a376-e09e-471c-85e5-b241f4333f95-proc\") pod \"perf-node-gather-daemonset-7zwlj\" (UID: \"8913a376-e09e-471c-85e5-b241f4333f95\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-7zwlj"
Apr 22 19:15:10.841354 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.841277 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tpfh8\" (UniqueName: \"kubernetes.io/projected/8913a376-e09e-471c-85e5-b241f4333f95-kube-api-access-tpfh8\") pod \"perf-node-gather-daemonset-7zwlj\" (UID: \"8913a376-e09e-471c-85e5-b241f4333f95\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-7zwlj"
Apr 22 19:15:10.841354 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.841344 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8913a376-e09e-471c-85e5-b241f4333f95-lib-modules\") pod \"perf-node-gather-daemonset-7zwlj\" (UID: \"8913a376-e09e-471c-85e5-b241f4333f95\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-7zwlj"
Apr 22 19:15:10.841516 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.841378 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8913a376-e09e-471c-85e5-b241f4333f95-sys\") pod \"perf-node-gather-daemonset-7zwlj\" (UID: \"8913a376-e09e-471c-85e5-b241f4333f95\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-7zwlj"
Apr 22 19:15:10.841516 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.841383 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8913a376-e09e-471c-85e5-b241f4333f95-proc\") pod \"perf-node-gather-daemonset-7zwlj\" (UID: \"8913a376-e09e-471c-85e5-b241f4333f95\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-7zwlj"
Apr 22 19:15:10.841516 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.841384 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8913a376-e09e-471c-85e5-b241f4333f95-podres\") pod \"perf-node-gather-daemonset-7zwlj\" (UID: \"8913a376-e09e-471c-85e5-b241f4333f95\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-7zwlj"
Apr 22 19:15:10.841516 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.841454 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8913a376-e09e-471c-85e5-b241f4333f95-sys\") pod \"perf-node-gather-daemonset-7zwlj\" (UID: \"8913a376-e09e-471c-85e5-b241f4333f95\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-7zwlj"
Apr 22 19:15:10.841786 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.841593 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8913a376-e09e-471c-85e5-b241f4333f95-lib-modules\") pod \"perf-node-gather-daemonset-7zwlj\" (UID: \"8913a376-e09e-471c-85e5-b241f4333f95\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-7zwlj"
Apr 22 19:15:10.849191 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.849163 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpfh8\" (UniqueName: \"kubernetes.io/projected/8913a376-e09e-471c-85e5-b241f4333f95-kube-api-access-tpfh8\") pod \"perf-node-gather-daemonset-7zwlj\" (UID: \"8913a376-e09e-471c-85e5-b241f4333f95\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-7zwlj"
Apr 22 19:15:10.954632 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:10.954531 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-7zwlj"
Apr 22 19:15:11.083816 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:11.083772 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-sn5zn/perf-node-gather-daemonset-7zwlj"]
Apr 22 19:15:11.087176 ip-10-0-136-53 kubenswrapper[2579]: W0422 19:15:11.087145 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8913a376_e09e_471c_85e5_b241f4333f95.slice/crio-8f1d135ce5c4b9409848894ccd167fc3271fa301e841bd5b506892c2d5b2bfd0 WatchSource:0}: Error finding container 8f1d135ce5c4b9409848894ccd167fc3271fa301e841bd5b506892c2d5b2bfd0: Status 404 returned error can't find the container with id 8f1d135ce5c4b9409848894ccd167fc3271fa301e841bd5b506892c2d5b2bfd0
Apr 22 19:15:11.216400 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:11.216309 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-7zwlj" event={"ID":"8913a376-e09e-471c-85e5-b241f4333f95","Type":"ContainerStarted","Data":"8f1d135ce5c4b9409848894ccd167fc3271fa301e841bd5b506892c2d5b2bfd0"}
Apr 22 19:15:11.607619 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:11.607591 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qd4bd_493591d7-8b3a-41e8-8c8c-40d1e116007e/dns/0.log"
Apr 22 19:15:11.624950 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:11.624914 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qd4bd_493591d7-8b3a-41e8-8c8c-40d1e116007e/kube-rbac-proxy/0.log"
Apr 22 19:15:11.666323 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:11.666290 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-d76kb_96372a97-6aae-4e84-bcb3-d4446e5315ed/dns-node-resolver/0.log"
Apr 22 19:15:12.102456 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:12.102420 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-56f5ddf4f8-ww8sg_b93cb4a4-7991-414d-9679-eaf8caa890ce/registry/0.log"
Apr 22 19:15:12.141877 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:12.141842 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-whszt_d751b5d6-3a64-4cdd-addf-8bc01e1ef230/node-ca/0.log"
Apr 22 19:15:12.220892 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:12.220850 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-7zwlj" event={"ID":"8913a376-e09e-471c-85e5-b241f4333f95","Type":"ContainerStarted","Data":"10a239998582269ee73f7f3956b790b486296a8ec65be6aea778291b4a984c1f"}
Apr 22 19:15:12.221100 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:12.220942 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness"
status="not ready" pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-7zwlj"
Apr 22 19:15:12.236726 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:12.236668 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-7zwlj" podStartSLOduration=2.236650952 podStartE2EDuration="2.236650952s" podCreationTimestamp="2026-04-22 19:15:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:15:12.235324831 +0000 UTC m=+1698.684107165" watchObservedRunningTime="2026-04-22 19:15:12.236650952 +0000 UTC m=+1698.685433288"
Apr 22 19:15:12.932568 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:12.932531 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-7v85r_32410f0c-444a-4b71-837b-09e27a6b6bb9/istio-proxy/0.log"
Apr 22 19:15:12.953885 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:12.953843 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-66ccb77bfb-zczmp_7033a011-280c-43ac-a654-1bfbeeac2a87/router/0.log"
Apr 22 19:15:13.369146 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:13.369086 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-2qtxz_3d74ab93-b985-4a88-882c-10c6a352ac61/serve-healthcheck-canary/0.log"
Apr 22 19:15:13.880572 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:13.880542 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bvh7k_bc3d3928-7592-4441-a0c2-d7041e583080/kube-rbac-proxy/0.log"
Apr 22 19:15:13.899302 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:13.899270 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bvh7k_bc3d3928-7592-4441-a0c2-d7041e583080/exporter/0.log"
Apr 22 19:15:13.917628 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:13.917597 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bvh7k_bc3d3928-7592-4441-a0c2-d7041e583080/extractor/0.log"
Apr 22 19:15:16.473631 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:16.473603 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-6869d4659b-496fv_54d5d20a-93bc-49e9-a149-9bb6ff0d511d/manager/0.log"
Apr 22 19:15:16.997266 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:16.997238 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-6f655776dd-8k9k2_d7d33c80-4ec6-4055-8a15-13659877acea/manager/0.log"
Apr 22 19:15:17.045060 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:17.045031 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-798bc4876f-2pm79_75b1e015-8755-43bc-85cf-1cf20445c262/manager/0.log"
Apr 22 19:15:17.255181 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:17.255098 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-7dnmq_5473aab8-dcb2-4ca5-975f-b326a6d109e0/manager/0.log"
Apr 22 19:15:17.273677 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:17.273646 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-cxc57_91a0099e-2e5f-4851-8224-cb91f9501782/s3-init/0.log"
Apr 22 19:15:17.306616 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:17.306576 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-mwktm_f7cb84e7-925f-4bc7-91bf-925046078dcc/seaweedfs/0.log"
Apr 22 19:15:18.236610 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:18.236578 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-7zwlj"
Apr 22 19:15:22.042372 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:22.042335 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-x6vrp_a74fd54d-3421-40fe-967e-0f3839d3664e/kube-storage-version-migrator-operator/1.log"
Apr 22 19:15:22.044693 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:22.044667 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-x6vrp_a74fd54d-3421-40fe-967e-0f3839d3664e/kube-storage-version-migrator-operator/0.log"
Apr 22 19:15:23.089566 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:23.089534 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b6nbf_b919031c-5663-434c-ad1b-7f33fd17b752/kube-multus-additional-cni-plugins/0.log"
Apr 22 19:15:23.108412 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:23.108380 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b6nbf_b919031c-5663-434c-ad1b-7f33fd17b752/egress-router-binary-copy/0.log"
Apr 22 19:15:23.129895 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:23.129865 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b6nbf_b919031c-5663-434c-ad1b-7f33fd17b752/cni-plugins/0.log"
Apr 22 19:15:23.146673 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:23.146638 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b6nbf_b919031c-5663-434c-ad1b-7f33fd17b752/bond-cni-plugin/0.log"
Apr 22 19:15:23.164766 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:23.164733 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b6nbf_b919031c-5663-434c-ad1b-7f33fd17b752/routeoverride-cni/0.log"
Apr 22 19:15:23.183676 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:23.183651 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b6nbf_b919031c-5663-434c-ad1b-7f33fd17b752/whereabouts-cni-bincopy/0.log"
Apr 22 19:15:23.201633 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:23.201609 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b6nbf_b919031c-5663-434c-ad1b-7f33fd17b752/whereabouts-cni/0.log"
Apr 22 19:15:23.452617 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:23.452537 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ssc8m_84618b00-ccd9-4eef-b5aa-c79f222e29a9/kube-multus/0.log"
Apr 22 19:15:23.602843 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:23.602814 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-z69xm_89068fbb-3d09-49a2-a369-e83cb48aaf0f/network-metrics-daemon/0.log"
Apr 22 19:15:23.620017 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:23.619954 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-z69xm_89068fbb-3d09-49a2-a369-e83cb48aaf0f/kube-rbac-proxy/0.log"
Apr 22 19:15:25.136298 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:25.136268 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tcrqv_e741af54-1487-45cb-b5ea-22577783221e/ovn-controller/0.log"
Apr 22 19:15:25.152883 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:25.152846 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tcrqv_e741af54-1487-45cb-b5ea-22577783221e/ovn-acl-logging/0.log"
Apr 22 19:15:25.160910 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:25.160877 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tcrqv_e741af54-1487-45cb-b5ea-22577783221e/ovn-acl-logging/1.log"
Apr 22 19:15:25.187224 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:25.187197 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tcrqv_e741af54-1487-45cb-b5ea-22577783221e/kube-rbac-proxy-node/0.log"
Apr 22 19:15:25.205322 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:25.205297 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tcrqv_e741af54-1487-45cb-b5ea-22577783221e/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 19:15:25.225157 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:25.225119 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tcrqv_e741af54-1487-45cb-b5ea-22577783221e/northd/0.log"
Apr 22 19:15:25.242714 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:25.242687 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tcrqv_e741af54-1487-45cb-b5ea-22577783221e/nbdb/0.log"
Apr 22 19:15:25.261777 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:25.261745 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tcrqv_e741af54-1487-45cb-b5ea-22577783221e/sbdb/0.log"
Apr 22 19:15:25.370748 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:25.370716 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tcrqv_e741af54-1487-45cb-b5ea-22577783221e/ovnkube-controller/0.log"
Apr 22 19:15:26.366516 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:26.366458 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-fsnhs_a85a3a29-77c8-4cb9-b029-9f13cfa79162/network-check-target-container/0.log"
Apr 22 19:15:27.307992 ip-10-0-136-53 kubenswrapper[2579]: I0422 19:15:27.307964 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-9lgpr_e9c820a3-1c2e-4d37-bfc4-d18d772642d3/iptables-alerter/0.log"