Apr 22 18:35:04.532124 ip-10-0-132-151 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 18:35:04.532138 ip-10-0-132-151 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 18:35:04.532149 ip-10-0-132-151 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 18:35:04.532526 ip-10-0-132-151 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 18:35:14.742559 ip-10-0-132-151 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 18:35:14.742575 ip-10-0-132-151 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 6d8c03757aa04db297ea99fbbb2892db --
Apr 22 18:37:50.905798 ip-10-0-132-151 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 18:37:51.292476 ip-10-0-132-151 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:37:51.292476 ip-10-0-132-151 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 18:37:51.292476 ip-10-0-132-151 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:37:51.292476 ip-10-0-132-151 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 18:37:51.292476 ip-10-0-132-151 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:37:51.293268 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.293179 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 18:37:51.295543 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295527 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:37:51.295543 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295542 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:37:51.295608 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295547 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:37:51.295608 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295552 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:37:51.295608 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295556 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:37:51.295608 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295560 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:37:51.295608 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295563 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:37:51.295608 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295566 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:37:51.295608 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295569 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:37:51.295608 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295573 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:37:51.295608 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295576 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:37:51.295608 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295578 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:37:51.295608 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295581 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:37:51.295608 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295584 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:37:51.295608 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295587 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:37:51.295608 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295589 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:37:51.295608 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295592 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:37:51.295608 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295594 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:37:51.295608 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295597 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:37:51.295608 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295600 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:37:51.295608 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295602 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:37:51.296062 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295605 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:37:51.296062 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295609 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:37:51.296062 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295614 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:37:51.296062 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295617 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:37:51.296062 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295621 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:37:51.296062 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295624 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:37:51.296062 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295627 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:37:51.296062 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295630 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:37:51.296062 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295633 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:37:51.296062 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295635 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:37:51.296062 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295638 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:37:51.296062 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295640 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:37:51.296062 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295643 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:37:51.296062 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295645 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:37:51.296062 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295648 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:37:51.296062 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295650 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:37:51.296062 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295653 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:37:51.296062 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295655 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:37:51.296062 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295658 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:37:51.296062 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295660 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:37:51.296568 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295663 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:37:51.296568 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295666 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:37:51.296568 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295669 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:37:51.296568 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295671 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:37:51.296568 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295673 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:37:51.296568 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295676 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:37:51.296568 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295678 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:37:51.296568 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295681 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:37:51.296568 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295683 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:37:51.296568 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295685 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:37:51.296568 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295688 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:37:51.296568 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295691 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:37:51.296568 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295693 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:37:51.296568 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295697 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:37:51.296568 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295701 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:37:51.296568 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295703 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:37:51.296568 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295706 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:37:51.296568 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295708 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:37:51.296568 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295711 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:37:51.296568 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295713 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:37:51.297056 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295716 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:37:51.297056 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295718 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:37:51.297056 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295721 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:37:51.297056 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295724 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:37:51.297056 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295726 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:37:51.297056 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295729 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:37:51.297056 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295731 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:37:51.297056 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295734 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:37:51.297056 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295736 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:37:51.297056 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295739 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:37:51.297056 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295741 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:37:51.297056 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295744 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:37:51.297056 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295746 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:37:51.297056 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295749 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:37:51.297056 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295751 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:37:51.297056 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295754 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:37:51.297056 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295757 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:37:51.297056 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295759 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:37:51.297056 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295761 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:37:51.297056 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295764 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:37:51.297574 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295767 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:37:51.297574 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295770 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:37:51.297574 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295773 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:37:51.297574 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295775 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:37:51.297574 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.295778 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:37:51.297574 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296156 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:37:51.297574 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296161 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:37:51.297574 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296164 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:37:51.297574 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296167 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:37:51.297574 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296170 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:37:51.297574 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296173 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:37:51.297574 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296176 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:37:51.297574 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296179 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:37:51.297574 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296181 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:37:51.297574 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296184 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:37:51.297574 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296186 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:37:51.297574 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296189 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:37:51.297574 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296191 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:37:51.297574 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296194 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:37:51.297574 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296197 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:37:51.297574 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296199 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:37:51.298086 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296202 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:37:51.298086 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296204 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:37:51.298086 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296207 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:37:51.298086 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296209 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:37:51.298086 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296212 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:37:51.298086 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296215 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:37:51.298086 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296218 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:37:51.298086 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296220 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:37:51.298086 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296223 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:37:51.298086 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296225 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:37:51.298086 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296229 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:37:51.298086 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296234 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:37:51.298086 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296236 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:37:51.298086 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296239 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:37:51.298086 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296242 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:37:51.298086 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296244 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:37:51.298086 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296247 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:37:51.298086 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296251 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:37:51.298086 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296253 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:37:51.298086 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296256 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:37:51.298732 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296258 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:37:51.298732 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296261 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:37:51.298732 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296263 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:37:51.298732 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296266 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:37:51.298732 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296268 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:37:51.298732 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296271 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:37:51.298732 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296273 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:37:51.298732 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296275 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:37:51.298732 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296278 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:37:51.298732 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296280 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:37:51.298732 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296283 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:37:51.298732 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296286 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:37:51.298732 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296288 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:37:51.298732 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296291 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:37:51.298732 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296293 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:37:51.298732 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296296 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:37:51.298732 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296298 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:37:51.298732 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296301 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:37:51.298732 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296303 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:37:51.298732 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296306 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:37:51.299420 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296309 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:37:51.299420 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296311 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:37:51.299420 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296314 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:37:51.299420 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296317 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:37:51.299420 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296320 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:37:51.299420 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296324 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:37:51.299420 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296327 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:37:51.299420 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296329 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:37:51.299420 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296332 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:37:51.299420 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296335 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:37:51.299420 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296338 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:37:51.299420 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296340 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:37:51.299420 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296343 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:37:51.299420 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296346 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:37:51.299420 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296348 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:37:51.299420 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296351 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:37:51.299420 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296353 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:37:51.299420 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296356 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:37:51.299420 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296358 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:37:51.299929 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296361 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:37:51.299929 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296363 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:37:51.299929 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296366 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:37:51.299929 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296368 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:37:51.299929 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296371 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:37:51.299929 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296373 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:37:51.299929 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296376 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:37:51.299929 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296378 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:37:51.299929 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296381 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:37:51.299929 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296384 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:37:51.299929 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.296386 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:37:51.299929 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297649 2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 18:37:51.299929 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297659 2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 18:37:51.299929 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297666 2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 18:37:51.299929 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297671 2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 18:37:51.299929 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297675 2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 18:37:51.299929 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297679 2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 18:37:51.299929 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297683 2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 18:37:51.299929 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297688 2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 18:37:51.299929 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297691 2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 18:37:51.299929 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297695 2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 18:37:51.300480 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297699 2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 18:37:51.300480 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297702 2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 18:37:51.300480 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297706 2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 18:37:51.300480 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297709 2578 flags.go:64] FLAG: --cgroup-root=""
Apr 22 18:37:51.300480 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297712 2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 18:37:51.300480 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297715 2578 flags.go:64] FLAG: --client-ca-file=""
Apr 22 18:37:51.300480 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297717 2578 flags.go:64] FLAG: --cloud-config=""
Apr 22 18:37:51.300480 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297721 2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 18:37:51.300480 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297724 2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 18:37:51.300480 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297729 2578 flags.go:64] FLAG: --cluster-domain=""
Apr 22 18:37:51.300480 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297732 2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 18:37:51.300480 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297735 2578 flags.go:64] FLAG: --config-dir=""
Apr 22 18:37:51.300480 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297737 2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 18:37:51.300480 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297741 2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 18:37:51.300480 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297745 2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 18:37:51.300480 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297752 2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 18:37:51.300480 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297755 2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 18:37:51.300480 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297758 2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 18:37:51.300480 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297761 2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 18:37:51.300480 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297764 2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 18:37:51.300480 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297767 2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 18:37:51.300480 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297771 2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 18:37:51.300480 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297773 2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 18:37:51.300480 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297778 2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 18:37:51.300480 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297781 2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 18:37:51.301092 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297784 2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 18:37:51.301092 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297787 2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 18:37:51.301092 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297790 2578 flags.go:64] FLAG: --enable-server="true"
Apr 22 18:37:51.301092 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297793 2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 18:37:51.301092 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297797 2578 flags.go:64] FLAG: --event-burst="100"
Apr 22 18:37:51.301092 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297800 2578 flags.go:64] FLAG: --event-qps="50"
Apr 22 18:37:51.301092 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297803 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 18:37:51.301092 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297807 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 18:37:51.301092 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297810 2578 flags.go:64] FLAG: --eviction-hard=""
Apr 22 18:37:51.301092 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297814 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 18:37:51.301092 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297817 2578 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 18:37:51.301092 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297820 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 18:37:51.301092 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297823 2578 flags.go:64] FLAG: --eviction-soft=""
Apr 22 18:37:51.301092 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297826 2578 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 18:37:51.301092 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297829 2578 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 18:37:51.301092 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297832 2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 18:37:51.301092 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297835 2578 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 18:37:51.301092 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297838 2578 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 18:37:51.301092 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297841 2578 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 18:37:51.301092 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297843 2578 flags.go:64] FLAG: --feature-gates=""
Apr 22 18:37:51.301092 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297847 2578 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 18:37:51.301092 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297851 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 18:37:51.301092 ip-10-0-132-151
kubenswrapper[2578]: I0422 18:37:51.297855 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 18:37:51.301092 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297858 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 18:37:51.301092 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297861 2578 flags.go:64] FLAG: --healthz-port="10248" Apr 22 18:37:51.301092 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297864 2578 flags.go:64] FLAG: --help="false" Apr 22 18:37:51.301801 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297867 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-132-151.ec2.internal" Apr 22 18:37:51.301801 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297871 2578 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 18:37:51.301801 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297874 2578 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 18:37:51.301801 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297877 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 18:37:51.301801 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297880 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 18:37:51.301801 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297884 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 18:37:51.301801 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297887 2578 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 18:37:51.301801 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297890 2578 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 18:37:51.301801 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297893 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 18:37:51.301801 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297896 2578 flags.go:64] FLAG: 
--kube-api-burst="100" Apr 22 18:37:51.301801 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297899 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 18:37:51.301801 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297902 2578 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 18:37:51.301801 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297905 2578 flags.go:64] FLAG: --kube-reserved="" Apr 22 18:37:51.301801 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297908 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 18:37:51.301801 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297911 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 18:37:51.301801 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297915 2578 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 18:37:51.301801 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297917 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 18:37:51.301801 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297920 2578 flags.go:64] FLAG: --lock-file="" Apr 22 18:37:51.301801 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297923 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 18:37:51.301801 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297926 2578 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 18:37:51.301801 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297929 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 18:37:51.301801 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297934 2578 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 18:37:51.301801 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297937 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 18:37:51.301801 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297940 2578 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 18:37:51.302383 ip-10-0-132-151 kubenswrapper[2578]: 
I0422 18:37:51.297943 2578 flags.go:64] FLAG: --logging-format="text" Apr 22 18:37:51.302383 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297946 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 18:37:51.302383 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297949 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 18:37:51.302383 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297952 2578 flags.go:64] FLAG: --manifest-url="" Apr 22 18:37:51.302383 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297957 2578 flags.go:64] FLAG: --manifest-url-header="" Apr 22 18:37:51.302383 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297961 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 18:37:51.302383 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297964 2578 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 18:37:51.302383 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297968 2578 flags.go:64] FLAG: --max-pods="110" Apr 22 18:37:51.302383 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297971 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 18:37:51.302383 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297975 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 18:37:51.302383 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297977 2578 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 18:37:51.302383 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297980 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 18:37:51.302383 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297983 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 18:37:51.302383 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297986 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 18:37:51.302383 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297990 2578 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 18:37:51.302383 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.297997 2578 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 18:37:51.302383 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298001 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 18:37:51.302383 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298003 2578 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 18:37:51.302383 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298007 2578 flags.go:64] FLAG: --pod-cidr="" Apr 22 18:37:51.302383 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298010 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 18:37:51.302383 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298015 2578 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 18:37:51.302383 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298019 2578 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 18:37:51.302383 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298022 2578 flags.go:64] FLAG: --pods-per-core="0" Apr 22 18:37:51.302383 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298025 2578 flags.go:64] FLAG: --port="10250" Apr 22 18:37:51.303024 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298028 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 18:37:51.303024 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298031 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0f3321430a2866191" Apr 22 18:37:51.303024 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298034 2578 flags.go:64] FLAG: --qos-reserved="" Apr 22 18:37:51.303024 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298037 2578 flags.go:64] FLAG: --read-only-port="10255" Apr 22 18:37:51.303024 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298040 
2578 flags.go:64] FLAG: --register-node="true" Apr 22 18:37:51.303024 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298043 2578 flags.go:64] FLAG: --register-schedulable="true" Apr 22 18:37:51.303024 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298045 2578 flags.go:64] FLAG: --register-with-taints="" Apr 22 18:37:51.303024 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298049 2578 flags.go:64] FLAG: --registry-burst="10" Apr 22 18:37:51.303024 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298052 2578 flags.go:64] FLAG: --registry-qps="5" Apr 22 18:37:51.303024 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298055 2578 flags.go:64] FLAG: --reserved-cpus="" Apr 22 18:37:51.303024 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298058 2578 flags.go:64] FLAG: --reserved-memory="" Apr 22 18:37:51.303024 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298061 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 18:37:51.303024 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298064 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 18:37:51.303024 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298069 2578 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 18:37:51.303024 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298072 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 18:37:51.303024 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298075 2578 flags.go:64] FLAG: --runonce="false" Apr 22 18:37:51.303024 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298078 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 18:37:51.303024 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298081 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 18:37:51.303024 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298084 2578 flags.go:64] FLAG: --seccomp-default="false" Apr 22 18:37:51.303024 ip-10-0-132-151 kubenswrapper[2578]: I0422 
18:37:51.298088 2578 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 18:37:51.303024 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298091 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 18:37:51.303024 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298094 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 18:37:51.303024 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298098 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 18:37:51.303024 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298101 2578 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 18:37:51.303024 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298105 2578 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 18:37:51.303024 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298108 2578 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 18:37:51.303765 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298111 2578 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 18:37:51.303765 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298113 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 18:37:51.303765 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298117 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 18:37:51.303765 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298119 2578 flags.go:64] FLAG: --system-cgroups="" Apr 22 18:37:51.303765 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298122 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 18:37:51.303765 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298128 2578 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 18:37:51.303765 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298130 2578 flags.go:64] FLAG: --tls-cert-file="" Apr 22 18:37:51.303765 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298133 2578 flags.go:64] FLAG: 
--tls-cipher-suites="[]" Apr 22 18:37:51.303765 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298138 2578 flags.go:64] FLAG: --tls-min-version="" Apr 22 18:37:51.303765 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298141 2578 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 18:37:51.303765 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298144 2578 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 18:37:51.303765 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298146 2578 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 18:37:51.303765 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298149 2578 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 18:37:51.303765 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298152 2578 flags.go:64] FLAG: --v="2" Apr 22 18:37:51.303765 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298156 2578 flags.go:64] FLAG: --version="false" Apr 22 18:37:51.303765 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298160 2578 flags.go:64] FLAG: --vmodule="" Apr 22 18:37:51.303765 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298164 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 18:37:51.303765 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.298168 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 18:37:51.303765 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.298945 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:37:51.303765 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.298972 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:37:51.303765 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.298977 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:37:51.303765 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.298983 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 
18:37:51.303765 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.298987 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:37:51.303765 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.298992 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:37:51.304346 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.298996 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:37:51.304346 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299009 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:37:51.304346 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299013 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:37:51.304346 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299017 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:37:51.304346 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299021 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:37:51.304346 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299026 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:37:51.304346 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299031 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:37:51.304346 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299035 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:37:51.304346 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299039 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:37:51.304346 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299043 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:37:51.304346 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299047 2578 
feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:37:51.304346 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299052 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:37:51.304346 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299056 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:37:51.304346 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299061 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:37:51.304346 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299070 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:37:51.304346 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299076 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 18:37:51.304346 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299083 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:37:51.304346 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299088 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:37:51.304346 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299093 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:37:51.304346 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299097 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:37:51.304871 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299101 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:37:51.304871 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299105 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:37:51.304871 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299110 2578 feature_gate.go:328] unrecognized feature gate: 
AdminNetworkPolicy Apr 22 18:37:51.304871 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299114 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:37:51.304871 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299118 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:37:51.304871 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299123 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:37:51.304871 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299132 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:37:51.304871 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299136 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:37:51.304871 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299140 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:37:51.304871 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299144 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:37:51.304871 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299148 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:37:51.304871 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299152 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:37:51.304871 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299157 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:37:51.304871 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299162 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:37:51.304871 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299167 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:37:51.304871 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299171 
2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:37:51.304871 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299175 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:37:51.304871 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299180 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:37:51.304871 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299188 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:37:51.304871 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299194 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:37:51.305413 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299198 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:37:51.305413 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299202 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:37:51.305413 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299206 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:37:51.305413 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299210 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:37:51.305413 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299214 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:37:51.305413 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299218 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:37:51.305413 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299222 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:37:51.305413 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299226 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:37:51.305413 
ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299230 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:37:51.305413 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299234 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:37:51.305413 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299238 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:37:51.305413 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299247 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:37:51.305413 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299251 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:37:51.305413 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299256 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:37:51.305413 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299260 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:37:51.305413 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299264 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:37:51.305413 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299268 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:37:51.305413 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299276 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:37:51.305413 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299281 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:37:51.305413 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299304 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:37:51.305921 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299310 2578 feature_gate.go:328] 
unrecognized feature gate: BootcNodeManagement Apr 22 18:37:51.305921 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299417 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:37:51.305921 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299423 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:37:51.305921 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299430 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 18:37:51.305921 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299435 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:37:51.305921 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299441 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:37:51.305921 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299446 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:37:51.305921 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299449 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:37:51.305921 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299453 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:37:51.305921 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299473 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:37:51.305921 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299478 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:37:51.305921 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299482 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:37:51.305921 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299485 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:37:51.305921 
ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299489 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:37:51.305921 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299494 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:37:51.305921 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299498 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:37:51.305921 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299502 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:37:51.305921 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299506 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:37:51.305921 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299510 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:37:51.305921 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.299514 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:37:51.306411 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.300228 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:37:51.306563 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.306545 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 18:37:51.306596 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.306563 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" 
Apr 22 18:37:51.306624 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306608 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:37:51.306624 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306613 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:37:51.306624 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306617 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:37:51.306624 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306620 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:37:51.306624 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306623 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:37:51.306749 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306627 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:37:51.306749 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306630 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:37:51.306749 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306633 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:37:51.306749 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306636 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:37:51.306749 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306639 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:37:51.306749 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306641 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:37:51.306749 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306645 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:37:51.306749 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306648 2578 
feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:37:51.306749 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306650 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:37:51.306749 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306653 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:37:51.306749 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306657 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 18:37:51.306749 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306662 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:37:51.306749 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306664 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:37:51.306749 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306667 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:37:51.306749 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306671 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:37:51.306749 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306674 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:37:51.306749 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306676 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:37:51.306749 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306679 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:37:51.306749 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306682 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:37:51.307214 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306685 2578 feature_gate.go:328] unrecognized 
feature gate: ManagedBootImagesAWS Apr 22 18:37:51.307214 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306687 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:37:51.307214 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306690 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:37:51.307214 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306692 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:37:51.307214 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306695 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:37:51.307214 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306698 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:37:51.307214 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306700 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:37:51.307214 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306703 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:37:51.307214 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306705 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:37:51.307214 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306708 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:37:51.307214 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306711 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:37:51.307214 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306714 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:37:51.307214 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306716 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:37:51.307214 ip-10-0-132-151 kubenswrapper[2578]: W0422 
18:37:51.306719 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:37:51.307214 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306722 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:37:51.307214 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306725 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:37:51.307214 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306727 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:37:51.307214 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306730 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:37:51.307214 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306733 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:37:51.307214 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306735 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:37:51.307806 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306738 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:37:51.307806 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306741 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:37:51.307806 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306743 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:37:51.307806 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306746 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:37:51.307806 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306749 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:37:51.307806 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306751 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 
18:37:51.307806 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306754 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:37:51.307806 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306757 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:37:51.307806 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306760 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:37:51.307806 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306762 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:37:51.307806 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306765 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:37:51.307806 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306767 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:37:51.307806 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306770 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:37:51.307806 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306773 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:37:51.307806 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306775 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:37:51.307806 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306778 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:37:51.307806 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306780 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:37:51.307806 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306783 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:37:51.307806 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306785 2578 feature_gate.go:328] unrecognized feature 
gate: KMSEncryptionProvider Apr 22 18:37:51.307806 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306788 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:37:51.308295 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306790 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:37:51.308295 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306794 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:37:51.308295 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306797 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:37:51.308295 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306799 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:37:51.308295 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306802 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:37:51.308295 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306805 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:37:51.308295 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306807 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:37:51.308295 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306810 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:37:51.308295 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306812 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:37:51.308295 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306815 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:37:51.308295 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306817 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:37:51.308295 
ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306820 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:37:51.308295 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306823 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:37:51.308295 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306825 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:37:51.308295 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306828 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:37:51.308295 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306830 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:37:51.308295 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306834 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:37:51.308295 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306837 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 18:37:51.308295 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306841 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:37:51.308782 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306843 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:37:51.308782 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306846 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:37:51.308782 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306849 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:37:51.308782 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.306854 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:37:51.308782 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306952 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:37:51.308782 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306957 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:37:51.308782 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306960 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:37:51.308782 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306963 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:37:51.308782 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306966 2578 feature_gate.go:328] unrecognized feature gate: 
InsightsConfigAPI Apr 22 18:37:51.308782 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306969 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:37:51.308782 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306973 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 18:37:51.308782 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306977 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:37:51.308782 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306980 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:37:51.308782 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306983 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:37:51.308782 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306985 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:37:51.309153 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306988 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:37:51.309153 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306991 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:37:51.309153 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306994 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:37:51.309153 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306997 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:37:51.309153 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.306999 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:37:51.309153 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307002 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:37:51.309153 
ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307005 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:37:51.309153 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307007 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:37:51.309153 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307010 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:37:51.309153 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307012 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:37:51.309153 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307015 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:37:51.309153 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307017 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:37:51.309153 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307020 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:37:51.309153 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307022 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:37:51.309153 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307026 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 18:37:51.309153 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307029 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:37:51.309153 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307032 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:37:51.309153 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307034 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:37:51.309153 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307037 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:37:51.309153 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307039 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:37:51.309658 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307042 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:37:51.309658 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307044 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:37:51.309658 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307051 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:37:51.309658 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307054 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:37:51.309658 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307056 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:37:51.309658 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307059 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:37:51.309658 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307062 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:37:51.309658 ip-10-0-132-151 kubenswrapper[2578]: W0422 
18:37:51.307064 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:37:51.309658 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307067 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:37:51.309658 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307070 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:37:51.309658 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307073 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:37:51.309658 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307075 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:37:51.309658 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307078 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:37:51.309658 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307080 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:37:51.309658 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307083 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:37:51.309658 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307085 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:37:51.309658 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307088 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:37:51.309658 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307090 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:37:51.309658 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307093 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:37:51.309658 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307096 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:37:51.309658 ip-10-0-132-151 
kubenswrapper[2578]: W0422 18:37:51.307098 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:37:51.310177 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307101 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:37:51.310177 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307104 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:37:51.310177 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307106 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:37:51.310177 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307108 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:37:51.310177 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307111 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:37:51.310177 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307114 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:37:51.310177 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307116 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:37:51.310177 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307119 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:37:51.310177 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307121 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:37:51.310177 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307124 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:37:51.310177 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307126 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:37:51.310177 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307129 2578 
feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:37:51.310177 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307131 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:37:51.310177 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307134 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:37:51.310177 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307137 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:37:51.310177 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307140 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:37:51.310177 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307142 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:37:51.310177 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307145 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:37:51.310177 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307147 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:37:51.310654 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307150 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:37:51.310654 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307152 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:37:51.310654 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307155 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:37:51.310654 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307157 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:37:51.310654 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307160 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 22 
18:37:51.310654 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307163 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:37:51.310654 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307165 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:37:51.310654 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307167 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:37:51.310654 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307170 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:37:51.310654 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307172 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:37:51.310654 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307175 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:37:51.310654 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307177 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:37:51.310654 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307180 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:37:51.310654 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307183 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:37:51.310654 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:51.307185 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:37:51.310654 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.307190 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true 
StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:37:51.311043 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.307910 2578 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 18:37:51.311367 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.311354 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 18:37:51.312206 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.312195 2578 server.go:1019] "Starting client certificate rotation" Apr 22 18:37:51.312311 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.312292 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 18:37:51.312368 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.312347 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 18:37:51.334138 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.334111 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 18:37:51.338155 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.338018 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 18:37:51.355006 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.354981 2578 log.go:25] "Validated CRI v1 runtime API" Apr 22 18:37:51.364044 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.364024 2578 log.go:25] "Validated CRI v1 image API" Apr 22 18:37:51.365491 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.365454 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 18:37:51.366168 ip-10-0-132-151 kubenswrapper[2578]: I0422 
18:37:51.366149 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 18:37:51.371075 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.371052 2578 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 bd96fa4e-5473-4c37-92b2-daeb501b75e8:/dev/nvme0n1p4 eec53c91-5a00-42d7-b5e9-2f399dfe56bb:/dev/nvme0n1p3] Apr 22 18:37:51.371145 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.371072 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 18:37:51.376528 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.376406 2578 manager.go:217] Machine: {Timestamp:2026-04-22 18:37:51.374757483 +0000 UTC m=+0.365024103 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100288 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec21dfd84d8e5cd8bff7a1a454162277 SystemUUID:ec21dfd8-4d8e-5cd8-bff7-a1a454162277 BootID:6d8c0375-7aa0-4db2-97ea-99fbbb2892db Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 
DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:54:6c:ea:17:ab Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:54:6c:ea:17:ab Speed:0 Mtu:9001} {Name:ovs-system MacAddress:f6:47:a1:e5:c8:d2 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 18:37:51.376528 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.376522 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 22 18:37:51.376645 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.376600 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 18:37:51.376935 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.376916 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 18:37:51.377075 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.376938 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-132-151.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 18:37:51.377119 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.377084 2578 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 18:37:51.377119 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.377093 2578 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 18:37:51.377119 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.377106 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:37:51.377767 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.377756 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:37:51.379380 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.379370 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:37:51.379496 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.379487 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 18:37:51.381420 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.381409 2578 kubelet.go:491] "Attempting to sync node with API server" Apr 22 18:37:51.381475 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.381427 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 18:37:51.381475 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.381439 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 18:37:51.381475 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.381449 2578 kubelet.go:397] "Adding apiserver pod source" Apr 22 18:37:51.381475 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.381457 2578 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 22 18:37:51.382422 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.382407 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:37:51.382480 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.382433 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:37:51.384763 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.384742 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-tr6xp" Apr 22 18:37:51.385098 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.385074 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 18:37:51.386653 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.386638 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 18:37:51.388078 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.388065 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 18:37:51.388151 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.388081 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 18:37:51.388151 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.388087 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 18:37:51.388151 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.388093 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 18:37:51.388151 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.388099 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 18:37:51.388151 ip-10-0-132-151 kubenswrapper[2578]: I0422 
18:37:51.388104 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 18:37:51.388151 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.388110 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 18:37:51.388151 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.388115 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 18:37:51.388151 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.388122 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 18:37:51.388151 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.388128 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 18:37:51.388151 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.388137 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 18:37:51.388151 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.388146 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 18:37:51.388941 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.388928 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 18:37:51.388941 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.388939 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 18:37:51.392876 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.392861 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 18:37:51.392968 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.392896 2578 server.go:1295] "Started kubelet" Apr 22 18:37:51.393026 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.392959 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 18:37:51.393423 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.393405 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" 
csr="csr-tr6xp" Apr 22 18:37:51.393689 ip-10-0-132-151 systemd[1]: Started Kubernetes Kubelet. Apr 22 18:37:51.393857 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.393698 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 18:37:51.393923 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.393902 2578 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 18:37:51.394596 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:51.394507 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-132-151.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 18:37:51.394658 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:51.394626 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 18:37:51.394785 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.394662 2578 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-132-151.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 18:37:51.395825 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.395809 2578 server.go:317] "Adding debug handlers to kubelet server" Apr 22 18:37:51.396227 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.396212 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 18:37:51.397418 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:51.396705 2578 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-151.ec2.internal.18a8c1bd35c6f8bb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-151.ec2.internal,UID:ip-10-0-132-151.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-132-151.ec2.internal,},FirstTimestamp:2026-04-22 18:37:51.392872635 +0000 UTC m=+0.383139254,LastTimestamp:2026-04-22 18:37:51.392872635 +0000 UTC m=+0.383139254,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-151.ec2.internal,}" Apr 22 18:37:51.402743 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:51.402723 2578 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 18:37:51.403102 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.403084 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 18:37:51.403626 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.403608 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 18:37:51.404556 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.404537 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 18:37:51.404694 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:51.404542 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-151.ec2.internal\" not found" Apr 22 18:37:51.404876 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.404855 2578 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 18:37:51.404966 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.404956 2578 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 18:37:51.406343 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.405076 2578 factory.go:55] Registering systemd factory Apr 22 18:37:51.406343 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.405102 2578 factory.go:223] Registration of the systemd container factory successfully Apr 22 18:37:51.406343 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.405210 2578 reconstruct.go:97] "Volume reconstruction finished" Apr 22 18:37:51.406343 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.405220 2578 reconciler.go:26] "Reconciler: start to sync state" Apr 22 18:37:51.406343 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.406193 2578 factory.go:153] Registering CRI-O factory Apr 22 18:37:51.406343 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.406213 2578 factory.go:223] Registration of the crio container factory successfully Apr 22 
18:37:51.406343 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.406268 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 18:37:51.406343 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.406313 2578 factory.go:103] Registering Raw factory Apr 22 18:37:51.406744 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.406359 2578 manager.go:1196] Started watching for new ooms in manager Apr 22 18:37:51.407276 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.407255 2578 manager.go:319] Starting recovery of all containers Apr 22 18:37:51.416534 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.416519 2578 manager.go:324] Recovery completed Apr 22 18:37:51.417060 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.417040 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:37:51.420444 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:51.420425 2578 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-132-151.ec2.internal\" not found" node="ip-10-0-132-151.ec2.internal" Apr 22 18:37:51.421338 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.421319 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:37:51.424753 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.424737 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-151.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:37:51.424836 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.424772 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-151.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:37:51.424836 ip-10-0-132-151 kubenswrapper[2578]: I0422 
18:37:51.424787 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-151.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:37:51.425323 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.425311 2578 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 18:37:51.425323 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.425322 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 18:37:51.425408 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.425339 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:37:51.427840 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.427827 2578 policy_none.go:49] "None policy: Start" Apr 22 18:37:51.427887 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.427844 2578 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 18:37:51.427887 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.427855 2578 state_mem.go:35] "Initializing new in-memory state store" Apr 22 18:37:51.464093 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.464065 2578 manager.go:341] "Starting Device Plugin manager" Apr 22 18:37:51.472182 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:51.464110 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 18:37:51.472182 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.464123 2578 server.go:85] "Starting device plugin registration server" Apr 22 18:37:51.472182 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.464350 2578 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 18:37:51.472182 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.464363 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 18:37:51.472182 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.464455 2578 plugin_watcher.go:51] "Plugin Watcher Start" 
path="/var/lib/kubelet/plugins_registry" Apr 22 18:37:51.472182 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.464552 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 18:37:51.472182 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.464560 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 18:37:51.472182 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:51.465346 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 18:37:51.472182 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:51.465385 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-151.ec2.internal\" not found" Apr 22 18:37:51.532903 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.532863 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 18:37:51.534066 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.534032 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 18:37:51.534066 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.534064 2578 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 18:37:51.534230 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.534107 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 22 18:37:51.534230 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.534113 2578 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 18:37:51.534230 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:51.534154 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 18:37:51.537858 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.537838 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:37:51.565889 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.565835 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:37:51.566847 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.566830 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-151.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:37:51.566929 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.566858 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-151.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:37:51.566929 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.566870 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-151.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:37:51.566929 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.566893 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-151.ec2.internal" Apr 22 18:37:51.575429 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.575414 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-151.ec2.internal" Apr 22 18:37:51.575517 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:51.575435 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-132-151.ec2.internal\": node \"ip-10-0-132-151.ec2.internal\" not found" Apr 22 
18:37:51.598370 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:51.598347 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-151.ec2.internal\" not found" Apr 22 18:37:51.634697 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.634653 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-151.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-132-151.ec2.internal"] Apr 22 18:37:51.634796 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.634748 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:37:51.636290 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.636272 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-151.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:37:51.636369 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.636305 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-151.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:37:51.636369 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.636318 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-151.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:37:51.637391 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.637378 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:37:51.637554 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.637541 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-151.ec2.internal" Apr 22 18:37:51.637592 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.637570 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:37:51.638119 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.638101 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-151.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:37:51.638119 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.638122 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-151.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:37:51.638245 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.638132 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-151.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:37:51.638245 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.638175 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-151.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:37:51.638245 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.638198 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-151.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:37:51.638245 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.638207 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-151.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:37:51.639152 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.639135 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-151.ec2.internal" Apr 22 18:37:51.639233 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.639164 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:37:51.639790 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.639775 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-151.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:37:51.639876 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.639803 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-151.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:37:51.639876 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.639818 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-151.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:37:51.672025 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:51.672004 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-151.ec2.internal\" not found" node="ip-10-0-132-151.ec2.internal" Apr 22 18:37:51.676222 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:51.676205 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-151.ec2.internal\" not found" node="ip-10-0-132-151.ec2.internal" Apr 22 18:37:51.698702 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:51.698680 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-151.ec2.internal\" not found" Apr 22 18:37:51.707226 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.707205 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/43f86013446addac1ce20d0896a63cc3-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-132-151.ec2.internal\" (UID: \"43f86013446addac1ce20d0896a63cc3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-151.ec2.internal" Apr 22 18:37:51.707287 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.707252 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/43f86013446addac1ce20d0896a63cc3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-151.ec2.internal\" (UID: \"43f86013446addac1ce20d0896a63cc3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-151.ec2.internal" Apr 22 18:37:51.707287 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.707273 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0786cfcb4c5f79e61aa898a7fed986dc-config\") pod \"kube-apiserver-proxy-ip-10-0-132-151.ec2.internal\" (UID: \"0786cfcb4c5f79e61aa898a7fed986dc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-151.ec2.internal" Apr 22 18:37:51.799796 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:51.799764 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-151.ec2.internal\" not found" Apr 22 18:37:51.808168 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.808147 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/43f86013446addac1ce20d0896a63cc3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-151.ec2.internal\" (UID: \"43f86013446addac1ce20d0896a63cc3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-151.ec2.internal" Apr 22 18:37:51.808224 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.808103 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/43f86013446addac1ce20d0896a63cc3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-151.ec2.internal\" (UID: \"43f86013446addac1ce20d0896a63cc3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-151.ec2.internal" Apr 22 18:37:51.808255 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.808224 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/43f86013446addac1ce20d0896a63cc3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-151.ec2.internal\" (UID: \"43f86013446addac1ce20d0896a63cc3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-151.ec2.internal" Apr 22 18:37:51.808255 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.808242 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0786cfcb4c5f79e61aa898a7fed986dc-config\") pod \"kube-apiserver-proxy-ip-10-0-132-151.ec2.internal\" (UID: \"0786cfcb4c5f79e61aa898a7fed986dc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-151.ec2.internal" Apr 22 18:37:51.808314 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.808277 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0786cfcb4c5f79e61aa898a7fed986dc-config\") pod \"kube-apiserver-proxy-ip-10-0-132-151.ec2.internal\" (UID: \"0786cfcb4c5f79e61aa898a7fed986dc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-151.ec2.internal" Apr 22 18:37:51.808314 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.808299 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/43f86013446addac1ce20d0896a63cc3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-151.ec2.internal\" (UID: \"43f86013446addac1ce20d0896a63cc3\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-151.ec2.internal"
Apr 22 18:37:51.900540 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:51.900514 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-151.ec2.internal\" not found"
Apr 22 18:37:51.975059 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.975037 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-151.ec2.internal"
Apr 22 18:37:51.979126 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:51.979106 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-151.ec2.internal"
Apr 22 18:37:52.001035 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:52.001003 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-151.ec2.internal\" not found"
Apr 22 18:37:52.101556 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:52.101512 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-151.ec2.internal\" not found"
Apr 22 18:37:52.202157 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:52.202070 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-151.ec2.internal\" not found"
Apr 22 18:37:52.234063 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:52.234038 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:37:52.302992 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:52.302959 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-151.ec2.internal\" not found"
Apr 22 18:37:52.313407 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:52.313386 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 18:37:52.313573 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:52.313552 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 18:37:52.313613 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:52.313574 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 18:37:52.313613 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:52.313592 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 18:37:52.395305 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:52.395260 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 18:32:51 +0000 UTC" deadline="2027-11-26 04:29:47.595612338 +0000 UTC"
Apr 22 18:37:52.395305 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:52.395296 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13977h51m55.200319136s"
Apr 22 18:37:52.403827 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:52.403796 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-151.ec2.internal\" not found"
Apr 22 18:37:52.403827 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:52.403821 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 18:37:52.414034 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:52.414007 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 18:37:52.432618 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:52.432595 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-m25hh"
Apr 22 18:37:52.440653 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:52.440630 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-m25hh"
Apr 22 18:37:52.504091 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:52.504030 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-151.ec2.internal\" not found"
Apr 22 18:37:52.603598 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:52.603564 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0786cfcb4c5f79e61aa898a7fed986dc.slice/crio-59bdf175c9f35769edb008f063787f4ef7c95c79d24e4b4696aa790110006940 WatchSource:0}: Error finding container 59bdf175c9f35769edb008f063787f4ef7c95c79d24e4b4696aa790110006940: Status 404 returned error can't find the container with id 59bdf175c9f35769edb008f063787f4ef7c95c79d24e4b4696aa790110006940
Apr 22 18:37:52.603853 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:52.603829 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43f86013446addac1ce20d0896a63cc3.slice/crio-a8520e7ae6afff574f7c6646a47270b8a9a3356c329b2c49946c241740615a9c WatchSource:0}: Error finding container a8520e7ae6afff574f7c6646a47270b8a9a3356c329b2c49946c241740615a9c: Status 404 returned error can't find the container with id
a8520e7ae6afff574f7c6646a47270b8a9a3356c329b2c49946c241740615a9c
Apr 22 18:37:52.604926 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:52.604910 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-151.ec2.internal\" not found"
Apr 22 18:37:52.608531 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:52.608515 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 18:37:52.705513 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:52.705452 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-151.ec2.internal\" not found"
Apr 22 18:37:52.805950 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:52.805868 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-151.ec2.internal\" not found"
Apr 22 18:37:52.906617 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:52.906580 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-151.ec2.internal\" not found"
Apr 22 18:37:52.934198 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:52.934163 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:37:53.005003 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.004972 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-151.ec2.internal"
Apr 22 18:37:53.015869 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.015843 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 18:37:53.016651 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.016634 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-151.ec2.internal"
Apr 22 18:37:53.026156 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.026125 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 18:37:53.321394 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.321365 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:37:53.382617 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.382591 2578 apiserver.go:52] "Watching apiserver"
Apr 22 18:37:53.390381 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.390342 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 18:37:53.391720 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.391690 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-gjdjf","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dmvm","openshift-cluster-node-tuning-operator/tuned-sw4px","openshift-dns/node-resolver-htng9","openshift-image-registry/node-ca-8vfmp","openshift-multus/network-metrics-daemon-bkhdr","openshift-network-diagnostics/network-check-target-fqfx5","openshift-ovn-kubernetes/ovnkube-node-f2rx9","kube-system/kube-apiserver-proxy-ip-10-0-132-151.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-151.ec2.internal","openshift-multus/multus-additional-cni-plugins-772rv","openshift-multus/multus-kh2qh","openshift-network-operator/iptables-alerter-c2nqs"]
Apr 22 18:37:53.393844 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.393823 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fqfx5"
Apr 22 18:37:53.393946 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:53.393895 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fqfx5" podUID="33339752-7b6a-4821-a16d-b079144b080d"
Apr 22 18:37:53.394927 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.394897 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dmvm"
Apr 22 18:37:53.396141 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.396077 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-sw4px"
Apr 22 18:37:53.397214 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.397133 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 18:37:53.397321 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.397223 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-zn9tj\""
Apr 22 18:37:53.397321 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.397313 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 18:37:53.397451 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.397427 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-image-registry/node-ca-8vfmp"
Apr 22 18:37:53.397531 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.397475 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 18:37:53.398398 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.398181 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:37:53.398693 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.398669 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-kb59n\""
Apr 22 18:37:53.401875 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.399327 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 18:37:53.401875 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.399440 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkhdr"
Apr 22 18:37:53.401875 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:53.399564 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bkhdr" podUID="9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f"
Apr 22 18:37:53.401875 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.399691 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-htng9"
Apr 22 18:37:53.403220 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.402834 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 18:37:53.403220 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.402843 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-pchrd\""
Apr 22 18:37:53.403220 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.402946 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 18:37:53.403220 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.403006 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-gjdjf"
Apr 22 18:37:53.403220 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.403037 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-pjkgq\""
Apr 22 18:37:53.403220 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.403052 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 18:37:53.403220 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.403159 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 18:37:53.403718 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.403434 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 18:37:53.404861 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.404706 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-772rv"
Apr 22 18:37:53.405251 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.404993 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9"
Apr 22 18:37:53.405661 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.405636 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 18:37:53.406035 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.406017 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 18:37:53.406843 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.406824 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kh2qh"
Apr 22 18:37:53.407541 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.407037 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 18:37:53.407541 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.407074 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 18:37:53.407541 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.407255 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 18:37:53.407541 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.407275 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 18:37:53.407541 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.407349 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 18:37:53.407541
ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.407382 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 18:37:53.407541 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.407429 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-2wpvl\""
Apr 22 18:37:53.407541 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.407436 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-zclmc\""
Apr 22 18:37:53.408662 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.408208 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-dmt8l\""
Apr 22 18:37:53.408662 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.408332 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 18:37:53.408662 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.408414 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 18:37:53.408662 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.408568 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 18:37:53.408891 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.408691 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 18:37:53.408891 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.408704 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 18:37:53.409027 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.408990 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-c2nqs"
Apr 22 18:37:53.410316 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.410290 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 18:37:53.410543 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.410528 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-zjrcb\""
Apr 22 18:37:53.411576 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.411557 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:37:53.411733 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.411559 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 18:37:53.411809 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.411682 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 18:37:53.411809 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.411724 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-r2fpj\""
Apr 22 18:37:53.418210 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.418187 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-etc-modprobe-d\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px"
Apr 22 18:37:53.418315 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.418223 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-run-openvswitch\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9"
Apr 22 18:37:53.418315 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.418249 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/14a4227e-0dba-42ae-a250-c23eb012b836-ovnkube-script-lib\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9"
Apr 22 18:37:53.418315 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.418275 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sszrj\" (UniqueName: \"kubernetes.io/projected/5cac15d4-0328-41c3-8bbc-e6d020fb09d2-kube-api-access-sszrj\") pod \"node-ca-8vfmp\" (UID: \"5cac15d4-0328-41c3-8bbc-e6d020fb09d2\") " pod="openshift-image-registry/node-ca-8vfmp"
Apr 22 18:37:53.418315 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.418299 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-multus-cni-dir\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh"
Apr 22 18:37:53.418552 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.418320 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-os-release\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh"
Apr 22 18:37:53.418552
ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.418344 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-host-run-netns\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh"
Apr 22 18:37:53.418552 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.418367 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-multus-conf-dir\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh"
Apr 22 18:37:53.418552 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.418391 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-sys\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px"
Apr 22 18:37:53.418552 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.418416 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-var-lib-kubelet\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px"
Apr 22 18:37:53.418552 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.418437 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-host\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px"
Apr 22 18:37:53.418552 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.418484 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph47r\" (UniqueName: \"kubernetes.io/projected/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-kube-api-access-ph47r\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px"
Apr 22 18:37:53.418552 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.418507 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/14a4227e-0dba-42ae-a250-c23eb012b836-env-overrides\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9"
Apr 22 18:37:53.418552 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.418532 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/940a66b8-963f-4a92-adee-fd47c60355d9-agent-certs\") pod \"konnectivity-agent-gjdjf\" (UID: \"940a66b8-963f-4a92-adee-fd47c60355d9\") " pod="kube-system/konnectivity-agent-gjdjf"
Apr 22 18:37:53.418963 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.418557 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f748f72b-69ca-4686-b516-08b6a1e0e7a1-os-release\") pod \"multus-additional-cni-plugins-772rv\" (UID: \"f748f72b-69ca-4686-b516-08b6a1e0e7a1\") " pod="openshift-multus/multus-additional-cni-plugins-772rv"
Apr 22 18:37:53.418963 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.418582 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq7nz\" (UniqueName: \"kubernetes.io/projected/33339752-7b6a-4821-a16d-b079144b080d-kube-api-access-zq7nz\") pod \"network-check-target-fqfx5\" (UID: \"33339752-7b6a-4821-a16d-b079144b080d\") " pod="openshift-network-diagnostics/network-check-target-fqfx5"
Apr 22 18:37:53.418963 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.418605 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-lib-modules\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px"
Apr 22 18:37:53.418963 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.418645 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-host-run-k8s-cni-cncf-io\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh"
Apr 22 18:37:53.418963 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.418684 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrs4f\" (UniqueName: \"kubernetes.io/projected/96f11a65-b703-4a16-911b-e932a7067c05-kube-api-access-qrs4f\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh"
Apr 22 18:37:53.418963 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.418708 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-host-cni-bin\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9"
Apr 22 18:37:53.418963 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.418732 2578
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da-socket-dir\") pod \"aws-ebs-csi-driver-node-4dmvm\" (UID: \"71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dmvm"
Apr 22 18:37:53.418963 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.418754 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5cac15d4-0328-41c3-8bbc-e6d020fb09d2-host\") pod \"node-ca-8vfmp\" (UID: \"5cac15d4-0328-41c3-8bbc-e6d020fb09d2\") " pod="openshift-image-registry/node-ca-8vfmp"
Apr 22 18:37:53.418963 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.418777 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-system-cni-dir\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh"
Apr 22 18:37:53.418963 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.418800 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/96f11a65-b703-4a16-911b-e932a7067c05-cni-binary-copy\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh"
Apr 22 18:37:53.418963 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.418823 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-etc-kubernetes\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px"
Apr 22 18:37:53.418963 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.418852 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-host-slash\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9"
Apr 22 18:37:53.418963 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.418875 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da-etc-selinux\") pod \"aws-ebs-csi-driver-node-4dmvm\" (UID: \"71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dmvm"
Apr 22 18:37:53.418963 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.418899 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/940a66b8-963f-4a92-adee-fd47c60355d9-konnectivity-ca\") pod \"konnectivity-agent-gjdjf\" (UID: \"940a66b8-963f-4a92-adee-fd47c60355d9\") " pod="kube-system/konnectivity-agent-gjdjf"
Apr 22 18:37:53.418963 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.418923 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f748f72b-69ca-4686-b516-08b6a1e0e7a1-system-cni-dir\") pod \"multus-additional-cni-plugins-772rv\" (UID: \"f748f72b-69ca-4686-b516-08b6a1e0e7a1\") " pod="openshift-multus/multus-additional-cni-plugins-772rv"
Apr 22 18:37:53.418963 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.418948 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-cnibin\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh"
Apr 22 18:37:53.419667 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.418974 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8ph5\" (UniqueName: \"kubernetes.io/projected/9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f-kube-api-access-g8ph5\") pod \"network-metrics-daemon-bkhdr\" (UID: \"9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f\") " pod="openshift-multus/network-metrics-daemon-bkhdr"
Apr 22 18:37:53.419667 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.418998 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/14a4227e-0dba-42ae-a250-c23eb012b836-ovnkube-config\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9"
Apr 22 18:37:53.419667 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.419024 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/14a4227e-0dba-42ae-a250-c23eb012b836-ovn-node-metrics-cert\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9"
Apr 22 18:37:53.419667 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.419049 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-hostroot\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh"
Apr 22 18:37:53.419667 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.419093 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\"
(UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-systemd-units\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9"
Apr 22 18:37:53.419667 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.419117 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-host-run-ovn-kubernetes\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9"
Apr 22 18:37:53.419667 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.419175 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8h7t\" (UniqueName: \"kubernetes.io/projected/71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da-kube-api-access-w8h7t\") pod \"aws-ebs-csi-driver-node-4dmvm\" (UID: \"71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dmvm"
Apr 22 18:37:53.419667 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.419202 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f748f72b-69ca-4686-b516-08b6a1e0e7a1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-772rv\" (UID: \"f748f72b-69ca-4686-b516-08b6a1e0e7a1\") " pod="openshift-multus/multus-additional-cni-plugins-772rv"
Apr 22 18:37:53.419667 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.419233 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdfzw\" (UniqueName: \"kubernetes.io/projected/f748f72b-69ca-4686-b516-08b6a1e0e7a1-kube-api-access-fdfzw\") pod \"multus-additional-cni-plugins-772rv\" (UID: \"f748f72b-69ca-4686-b516-08b6a1e0e7a1\") " pod="openshift-multus/multus-additional-cni-plugins-772rv"
Apr 22 18:37:53.419667 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.419276 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-run\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px"
Apr 22 18:37:53.420088 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.419724 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-run-systemd\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9"
Apr 22 18:37:53.420088 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.419788 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-node-log\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9"
Apr 22 18:37:53.420088 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.419819 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da-sys-fs\") pod \"aws-ebs-csi-driver-node-4dmvm\" (UID: \"71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dmvm"
Apr 22 18:37:53.420088 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.419843 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5cac15d4-0328-41c3-8bbc-e6d020fb09d2-serviceca\") pod \"node-ca-8vfmp\" (UID: \"5cac15d4-0328-41c3-8bbc-e6d020fb09d2\") " pod="openshift-image-registry/node-ca-8vfmp"
Apr 22 18:37:53.420088 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.419866 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-etc-tuned\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px"
Apr 22 18:37:53.420088 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.419919 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-host-run-netns\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9"
Apr 22 18:37:53.420088 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.419959 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7phk4\" (UniqueName: \"kubernetes.io/projected/b0d1706a-8c90-4a85-abfe-07cf53827364-kube-api-access-7phk4\") pod \"node-resolver-htng9\" (UID: \"b0d1706a-8c90-4a85-abfe-07cf53827364\") " pod="openshift-dns/node-resolver-htng9"
Apr 22 18:37:53.420088 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.419983 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-host-var-lib-cni-bin\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh"
Apr 22 18:37:53.420088 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.420007 2578 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-host-var-lib-kubelet\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh" Apr 22 18:37:53.420088 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.420042 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-var-lib-openvswitch\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:37:53.420088 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.420062 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-etc-sysctl-conf\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px" Apr 22 18:37:53.420635 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.420097 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-etc-systemd\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px" Apr 22 18:37:53.420635 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.420153 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-run-ovn\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 
18:37:53.420635 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.420182 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da-registration-dir\") pod \"aws-ebs-csi-driver-node-4dmvm\" (UID: \"71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dmvm" Apr 22 18:37:53.420635 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.420230 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-etc-kubernetes\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh" Apr 22 18:37:53.420635 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.420257 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-etc-sysctl-d\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px" Apr 22 18:37:53.420635 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.420293 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-etc-openvswitch\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:37:53.420635 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.420321 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-log-socket\") pod 
\"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:37:53.420635 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.420344 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b0d1706a-8c90-4a85-abfe-07cf53827364-tmp-dir\") pod \"node-resolver-htng9\" (UID: \"b0d1706a-8c90-4a85-abfe-07cf53827364\") " pod="openshift-dns/node-resolver-htng9" Apr 22 18:37:53.420635 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.420368 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f748f72b-69ca-4686-b516-08b6a1e0e7a1-cnibin\") pod \"multus-additional-cni-plugins-772rv\" (UID: \"f748f72b-69ca-4686-b516-08b6a1e0e7a1\") " pod="openshift-multus/multus-additional-cni-plugins-772rv" Apr 22 18:37:53.420635 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.420390 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f748f72b-69ca-4686-b516-08b6a1e0e7a1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-772rv\" (UID: \"f748f72b-69ca-4686-b516-08b6a1e0e7a1\") " pod="openshift-multus/multus-additional-cni-plugins-772rv" Apr 22 18:37:53.420635 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.420407 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/96f11a65-b703-4a16-911b-e932a7067c05-multus-daemon-config\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh" Apr 22 18:37:53.420635 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.420421 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-tmp\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px" Apr 22 18:37:53.420635 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.420435 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f-metrics-certs\") pod \"network-metrics-daemon-bkhdr\" (UID: \"9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f\") " pod="openshift-multus/network-metrics-daemon-bkhdr" Apr 22 18:37:53.420635 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.420487 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-multus-socket-dir-parent\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh" Apr 22 18:37:53.420635 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.420527 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-etc-sysconfig\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px" Apr 22 18:37:53.420635 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.420551 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-host-cni-netd\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:37:53.420635 ip-10-0-132-151 kubenswrapper[2578]: I0422 
18:37:53.420574 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b0d1706a-8c90-4a85-abfe-07cf53827364-hosts-file\") pod \"node-resolver-htng9\" (UID: \"b0d1706a-8c90-4a85-abfe-07cf53827364\") " pod="openshift-dns/node-resolver-htng9" Apr 22 18:37:53.421309 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.420598 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-host-run-multus-certs\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh" Apr 22 18:37:53.421309 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.420621 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-host-kubelet\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:37:53.421309 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.420647 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:37:53.421309 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.420673 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f748f72b-69ca-4686-b516-08b6a1e0e7a1-whereabouts-flatfile-configmap\") pod 
\"multus-additional-cni-plugins-772rv\" (UID: \"f748f72b-69ca-4686-b516-08b6a1e0e7a1\") " pod="openshift-multus/multus-additional-cni-plugins-772rv" Apr 22 18:37:53.421309 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.420698 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47j2n\" (UniqueName: \"kubernetes.io/projected/14a4227e-0dba-42ae-a250-c23eb012b836-kube-api-access-47j2n\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:37:53.421309 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.420722 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4dmvm\" (UID: \"71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dmvm" Apr 22 18:37:53.421309 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.420746 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da-device-dir\") pod \"aws-ebs-csi-driver-node-4dmvm\" (UID: \"71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dmvm" Apr 22 18:37:53.421309 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.420772 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f748f72b-69ca-4686-b516-08b6a1e0e7a1-cni-binary-copy\") pod \"multus-additional-cni-plugins-772rv\" (UID: \"f748f72b-69ca-4686-b516-08b6a1e0e7a1\") " pod="openshift-multus/multus-additional-cni-plugins-772rv" Apr 22 18:37:53.421309 ip-10-0-132-151 kubenswrapper[2578]: I0422 
18:37:53.420795 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-host-var-lib-cni-multus\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh" Apr 22 18:37:53.442715 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.442654 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:32:52 +0000 UTC" deadline="2027-12-31 21:06:23.99953781 +0000 UTC" Apr 22 18:37:53.442715 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.442692 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14834h28m30.556849385s" Apr 22 18:37:53.505986 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.505958 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 18:37:53.521800 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.521777 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-systemd-units\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:37:53.521940 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.521804 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-host-run-ovn-kubernetes\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:37:53.521940 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.521827 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-w8h7t\" (UniqueName: \"kubernetes.io/projected/71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da-kube-api-access-w8h7t\") pod \"aws-ebs-csi-driver-node-4dmvm\" (UID: \"71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dmvm" Apr 22 18:37:53.521940 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.521853 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f748f72b-69ca-4686-b516-08b6a1e0e7a1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-772rv\" (UID: \"f748f72b-69ca-4686-b516-08b6a1e0e7a1\") " pod="openshift-multus/multus-additional-cni-plugins-772rv" Apr 22 18:37:53.521940 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.521875 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fdfzw\" (UniqueName: \"kubernetes.io/projected/f748f72b-69ca-4686-b516-08b6a1e0e7a1-kube-api-access-fdfzw\") pod \"multus-additional-cni-plugins-772rv\" (UID: \"f748f72b-69ca-4686-b516-08b6a1e0e7a1\") " pod="openshift-multus/multus-additional-cni-plugins-772rv" Apr 22 18:37:53.521940 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.521885 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-systemd-units\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:37:53.521940 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.521890 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-host-run-ovn-kubernetes\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:37:53.521940 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.521897 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b331f320-606f-47f7-835d-668250b816bc-host-slash\") pod \"iptables-alerter-c2nqs\" (UID: \"b331f320-606f-47f7-835d-668250b816bc\") " pod="openshift-network-operator/iptables-alerter-c2nqs" Apr 22 18:37:53.522266 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.521955 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-run\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px" Apr 22 18:37:53.522266 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.521982 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-run-systemd\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:37:53.522266 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522009 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-node-log\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:37:53.522266 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522030 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f748f72b-69ca-4686-b516-08b6a1e0e7a1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-772rv\" (UID: 
\"f748f72b-69ca-4686-b516-08b6a1e0e7a1\") " pod="openshift-multus/multus-additional-cni-plugins-772rv" Apr 22 18:37:53.522266 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522054 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-node-log\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:37:53.522266 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522077 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-run\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px" Apr 22 18:37:53.522266 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522114 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da-sys-fs\") pod \"aws-ebs-csi-driver-node-4dmvm\" (UID: \"71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dmvm" Apr 22 18:37:53.522266 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522151 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-run-systemd\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:37:53.522266 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522153 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5cac15d4-0328-41c3-8bbc-e6d020fb09d2-serviceca\") pod \"node-ca-8vfmp\" (UID: \"5cac15d4-0328-41c3-8bbc-e6d020fb09d2\") " 
pod="openshift-image-registry/node-ca-8vfmp" Apr 22 18:37:53.522266 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522185 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-etc-tuned\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px" Apr 22 18:37:53.522266 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522202 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da-sys-fs\") pod \"aws-ebs-csi-driver-node-4dmvm\" (UID: \"71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dmvm" Apr 22 18:37:53.522266 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522208 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-host-run-netns\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:37:53.522266 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522235 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7phk4\" (UniqueName: \"kubernetes.io/projected/b0d1706a-8c90-4a85-abfe-07cf53827364-kube-api-access-7phk4\") pod \"node-resolver-htng9\" (UID: \"b0d1706a-8c90-4a85-abfe-07cf53827364\") " pod="openshift-dns/node-resolver-htng9" Apr 22 18:37:53.522266 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522260 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-host-var-lib-cni-bin\") pod \"multus-kh2qh\" (UID: 
\"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh" Apr 22 18:37:53.522929 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522297 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-host-run-netns\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:37:53.522929 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522311 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-host-var-lib-cni-bin\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh" Apr 22 18:37:53.522929 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522329 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-host-var-lib-kubelet\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh" Apr 22 18:37:53.522929 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522355 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-var-lib-openvswitch\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:37:53.522929 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522381 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-etc-sysctl-conf\") pod \"tuned-sw4px\" (UID: 
\"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px" Apr 22 18:37:53.522929 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522381 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-host-var-lib-kubelet\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh" Apr 22 18:37:53.522929 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522402 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-etc-systemd\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px" Apr 22 18:37:53.522929 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522439 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-var-lib-openvswitch\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:37:53.522929 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522456 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-etc-systemd\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px" Apr 22 18:37:53.522929 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522504 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-run-ovn\") pod \"ovnkube-node-f2rx9\" (UID: 
\"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:37:53.522929 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522532 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-etc-sysctl-conf\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px" Apr 22 18:37:53.522929 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522561 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da-registration-dir\") pod \"aws-ebs-csi-driver-node-4dmvm\" (UID: \"71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dmvm" Apr 22 18:37:53.522929 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522576 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-run-ovn\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:37:53.522929 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522588 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-etc-kubernetes\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh" Apr 22 18:37:53.522929 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522558 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 18:37:53.522929 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522608 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5zm7\" (UniqueName: \"kubernetes.io/projected/b331f320-606f-47f7-835d-668250b816bc-kube-api-access-m5zm7\") pod \"iptables-alerter-c2nqs\" (UID: \"b331f320-606f-47f7-835d-668250b816bc\") " pod="openshift-network-operator/iptables-alerter-c2nqs" Apr 22 18:37:53.522929 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522620 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5cac15d4-0328-41c3-8bbc-e6d020fb09d2-serviceca\") pod \"node-ca-8vfmp\" (UID: \"5cac15d4-0328-41c3-8bbc-e6d020fb09d2\") " pod="openshift-image-registry/node-ca-8vfmp" Apr 22 18:37:53.522929 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522634 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-etc-kubernetes\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh" Apr 22 18:37:53.523624 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522620 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da-registration-dir\") pod \"aws-ebs-csi-driver-node-4dmvm\" (UID: \"71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dmvm" Apr 22 18:37:53.523624 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522625 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-etc-sysctl-d\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px" Apr 22 18:37:53.523624 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522677 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-etc-openvswitch\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:37:53.523624 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522699 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-log-socket\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:37:53.523624 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522725 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-etc-sysctl-d\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px" Apr 22 18:37:53.523624 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522726 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-etc-openvswitch\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:37:53.523624 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522733 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/b0d1706a-8c90-4a85-abfe-07cf53827364-tmp-dir\") pod \"node-resolver-htng9\" (UID: \"b0d1706a-8c90-4a85-abfe-07cf53827364\") " pod="openshift-dns/node-resolver-htng9" Apr 22 18:37:53.523624 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522749 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-log-socket\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:37:53.523624 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522763 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f748f72b-69ca-4686-b516-08b6a1e0e7a1-cnibin\") pod \"multus-additional-cni-plugins-772rv\" (UID: \"f748f72b-69ca-4686-b516-08b6a1e0e7a1\") " pod="openshift-multus/multus-additional-cni-plugins-772rv" Apr 22 18:37:53.523624 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522791 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f748f72b-69ca-4686-b516-08b6a1e0e7a1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-772rv\" (UID: \"f748f72b-69ca-4686-b516-08b6a1e0e7a1\") " pod="openshift-multus/multus-additional-cni-plugins-772rv" Apr 22 18:37:53.523624 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522815 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/96f11a65-b703-4a16-911b-e932a7067c05-multus-daemon-config\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh" Apr 22 18:37:53.523624 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522815 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" 
(UniqueName: \"kubernetes.io/host-path/f748f72b-69ca-4686-b516-08b6a1e0e7a1-cnibin\") pod \"multus-additional-cni-plugins-772rv\" (UID: \"f748f72b-69ca-4686-b516-08b6a1e0e7a1\") " pod="openshift-multus/multus-additional-cni-plugins-772rv" Apr 22 18:37:53.523624 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522839 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-tmp\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px" Apr 22 18:37:53.523624 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522863 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f-metrics-certs\") pod \"network-metrics-daemon-bkhdr\" (UID: \"9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f\") " pod="openshift-multus/network-metrics-daemon-bkhdr" Apr 22 18:37:53.523624 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522889 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-multus-socket-dir-parent\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh" Apr 22 18:37:53.523624 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522919 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b331f320-606f-47f7-835d-668250b816bc-iptables-alerter-script\") pod \"iptables-alerter-c2nqs\" (UID: \"b331f320-606f-47f7-835d-668250b816bc\") " pod="openshift-network-operator/iptables-alerter-c2nqs" Apr 22 18:37:53.523624 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522955 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-etc-sysconfig\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px" Apr 22 18:37:53.524305 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.522978 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-host-cni-netd\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:37:53.524305 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:53.522998 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:37:53.524305 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523009 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b0d1706a-8c90-4a85-abfe-07cf53827364-tmp-dir\") pod \"node-resolver-htng9\" (UID: \"b0d1706a-8c90-4a85-abfe-07cf53827364\") " pod="openshift-dns/node-resolver-htng9" Apr 22 18:37:53.524305 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523046 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b0d1706a-8c90-4a85-abfe-07cf53827364-hosts-file\") pod \"node-resolver-htng9\" (UID: \"b0d1706a-8c90-4a85-abfe-07cf53827364\") " pod="openshift-dns/node-resolver-htng9" Apr 22 18:37:53.524305 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:53.523090 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f-metrics-certs podName:9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f nodeName:}" failed. 
No retries permitted until 2026-04-22 18:37:54.023052439 +0000 UTC m=+3.013319066 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f-metrics-certs") pod "network-metrics-daemon-bkhdr" (UID: "9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:37:53.524305 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523098 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-etc-sysconfig\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px" Apr 22 18:37:53.524305 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523098 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-multus-socket-dir-parent\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh" Apr 22 18:37:53.524305 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523002 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b0d1706a-8c90-4a85-abfe-07cf53827364-hosts-file\") pod \"node-resolver-htng9\" (UID: \"b0d1706a-8c90-4a85-abfe-07cf53827364\") " pod="openshift-dns/node-resolver-htng9" Apr 22 18:37:53.524305 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523135 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-host-cni-netd\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:37:53.524305 
ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523142 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-host-run-multus-certs\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh" Apr 22 18:37:53.524305 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523165 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-host-kubelet\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:37:53.524305 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523190 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:37:53.524305 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523217 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f748f72b-69ca-4686-b516-08b6a1e0e7a1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-772rv\" (UID: \"f748f72b-69ca-4686-b516-08b6a1e0e7a1\") " pod="openshift-multus/multus-additional-cni-plugins-772rv" Apr 22 18:37:53.524305 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523242 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-47j2n\" (UniqueName: \"kubernetes.io/projected/14a4227e-0dba-42ae-a250-c23eb012b836-kube-api-access-47j2n\") pod 
\"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:37:53.524305 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523267 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4dmvm\" (UID: \"71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dmvm" Apr 22 18:37:53.524305 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523292 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da-device-dir\") pod \"aws-ebs-csi-driver-node-4dmvm\" (UID: \"71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dmvm" Apr 22 18:37:53.524305 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523317 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:37:53.525121 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523317 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f748f72b-69ca-4686-b516-08b6a1e0e7a1-cni-binary-copy\") pod \"multus-additional-cni-plugins-772rv\" (UID: \"f748f72b-69ca-4686-b516-08b6a1e0e7a1\") " pod="openshift-multus/multus-additional-cni-plugins-772rv" Apr 22 18:37:53.525121 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523356 2578 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-host-var-lib-cni-multus\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh" Apr 22 18:37:53.525121 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523355 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-host-run-multus-certs\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh" Apr 22 18:37:53.525121 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523366 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/96f11a65-b703-4a16-911b-e932a7067c05-multus-daemon-config\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh" Apr 22 18:37:53.525121 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523381 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-etc-modprobe-d\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px" Apr 22 18:37:53.525121 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523407 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-run-openvswitch\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:37:53.525121 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523423 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-host-kubelet\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:37:53.525121 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523433 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/14a4227e-0dba-42ae-a250-c23eb012b836-ovnkube-script-lib\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:37:53.525121 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523475 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sszrj\" (UniqueName: \"kubernetes.io/projected/5cac15d4-0328-41c3-8bbc-e6d020fb09d2-kube-api-access-sszrj\") pod \"node-ca-8vfmp\" (UID: \"5cac15d4-0328-41c3-8bbc-e6d020fb09d2\") " pod="openshift-image-registry/node-ca-8vfmp" Apr 22 18:37:53.525121 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523499 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-multus-cni-dir\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh" Apr 22 18:37:53.525121 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523512 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-etc-modprobe-d\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px" Apr 22 18:37:53.525121 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523522 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"os-release\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-os-release\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh" Apr 22 18:37:53.525121 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523547 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-host-run-netns\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh" Apr 22 18:37:53.525121 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523544 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da-device-dir\") pod \"aws-ebs-csi-driver-node-4dmvm\" (UID: \"71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dmvm" Apr 22 18:37:53.525121 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523592 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-host-run-netns\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh" Apr 22 18:37:53.525121 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523606 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-host-var-lib-cni-multus\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh" Apr 22 18:37:53.525121 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523610 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-multus-cni-dir\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh" Apr 22 18:37:53.525121 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523498 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4dmvm\" (UID: \"71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dmvm" Apr 22 18:37:53.525939 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523627 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-multus-conf-dir\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh" Apr 22 18:37:53.525939 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523653 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-sys\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px" Apr 22 18:37:53.525939 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523660 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-run-openvswitch\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:37:53.525939 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523366 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/f748f72b-69ca-4686-b516-08b6a1e0e7a1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-772rv\" (UID: \"f748f72b-69ca-4686-b516-08b6a1e0e7a1\") " pod="openshift-multus/multus-additional-cni-plugins-772rv" Apr 22 18:37:53.525939 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523678 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-var-lib-kubelet\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px" Apr 22 18:37:53.525939 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523711 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-multus-conf-dir\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh" Apr 22 18:37:53.525939 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523713 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-os-release\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh" Apr 22 18:37:53.525939 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523730 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-var-lib-kubelet\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px" Apr 22 18:37:53.525939 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523749 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-host\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px" Apr 22 18:37:53.525939 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523777 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ph47r\" (UniqueName: \"kubernetes.io/projected/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-kube-api-access-ph47r\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px" Apr 22 18:37:53.525939 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523770 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-sys\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px" Apr 22 18:37:53.525939 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523787 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-host\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px" Apr 22 18:37:53.525939 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523800 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/14a4227e-0dba-42ae-a250-c23eb012b836-env-overrides\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:37:53.525939 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523824 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/940a66b8-963f-4a92-adee-fd47c60355d9-agent-certs\") pod \"konnectivity-agent-gjdjf\" (UID: \"940a66b8-963f-4a92-adee-fd47c60355d9\") " pod="kube-system/konnectivity-agent-gjdjf" Apr 22 18:37:53.525939 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523848 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f748f72b-69ca-4686-b516-08b6a1e0e7a1-os-release\") pod \"multus-additional-cni-plugins-772rv\" (UID: \"f748f72b-69ca-4686-b516-08b6a1e0e7a1\") " pod="openshift-multus/multus-additional-cni-plugins-772rv" Apr 22 18:37:53.525939 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523873 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zq7nz\" (UniqueName: \"kubernetes.io/projected/33339752-7b6a-4821-a16d-b079144b080d-kube-api-access-zq7nz\") pod \"network-check-target-fqfx5\" (UID: \"33339752-7b6a-4821-a16d-b079144b080d\") " pod="openshift-network-diagnostics/network-check-target-fqfx5" Apr 22 18:37:53.525939 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523897 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-lib-modules\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px" Apr 22 18:37:53.525939 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523922 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-host-run-k8s-cni-cncf-io\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh" Apr 22 18:37:53.526594 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.523937 2578 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f748f72b-69ca-4686-b516-08b6a1e0e7a1-cni-binary-copy\") pod \"multus-additional-cni-plugins-772rv\" (UID: \"f748f72b-69ca-4686-b516-08b6a1e0e7a1\") " pod="openshift-multus/multus-additional-cni-plugins-772rv" Apr 22 18:37:53.526594 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.524011 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-lib-modules\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px" Apr 22 18:37:53.526594 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.524013 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f748f72b-69ca-4686-b516-08b6a1e0e7a1-os-release\") pod \"multus-additional-cni-plugins-772rv\" (UID: \"f748f72b-69ca-4686-b516-08b6a1e0e7a1\") " pod="openshift-multus/multus-additional-cni-plugins-772rv" Apr 22 18:37:53.526594 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.524055 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-host-run-k8s-cni-cncf-io\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh" Apr 22 18:37:53.526594 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.524088 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrs4f\" (UniqueName: \"kubernetes.io/projected/96f11a65-b703-4a16-911b-e932a7067c05-kube-api-access-qrs4f\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh" Apr 22 18:37:53.526594 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.524246 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-host-cni-bin\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9"
Apr 22 18:37:53.526594 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.524277 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/14a4227e-0dba-42ae-a250-c23eb012b836-ovnkube-script-lib\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9"
Apr 22 18:37:53.526594 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.524282 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da-socket-dir\") pod \"aws-ebs-csi-driver-node-4dmvm\" (UID: \"71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dmvm"
Apr 22 18:37:53.526594 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.524305 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-host-cni-bin\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9"
Apr 22 18:37:53.526594 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.524329 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5cac15d4-0328-41c3-8bbc-e6d020fb09d2-host\") pod \"node-ca-8vfmp\" (UID: \"5cac15d4-0328-41c3-8bbc-e6d020fb09d2\") " pod="openshift-image-registry/node-ca-8vfmp"
Apr 22 18:37:53.526594 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.524356 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-system-cni-dir\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh"
Apr 22 18:37:53.526594 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.524381 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/96f11a65-b703-4a16-911b-e932a7067c05-cni-binary-copy\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh"
Apr 22 18:37:53.526594 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.524384 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5cac15d4-0328-41c3-8bbc-e6d020fb09d2-host\") pod \"node-ca-8vfmp\" (UID: \"5cac15d4-0328-41c3-8bbc-e6d020fb09d2\") " pod="openshift-image-registry/node-ca-8vfmp"
Apr 22 18:37:53.526594 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.524387 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/14a4227e-0dba-42ae-a250-c23eb012b836-env-overrides\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9"
Apr 22 18:37:53.526594 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.524405 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-etc-kubernetes\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px"
Apr 22 18:37:53.526594 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.524430 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-system-cni-dir\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh"
Apr 22 18:37:53.526594 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.524433 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-host-slash\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9"
Apr 22 18:37:53.526594 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.524448 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-etc-kubernetes\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px"
Apr 22 18:37:53.527268 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.524484 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/14a4227e-0dba-42ae-a250-c23eb012b836-host-slash\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9"
Apr 22 18:37:53.527268 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.524482 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da-etc-selinux\") pod \"aws-ebs-csi-driver-node-4dmvm\" (UID: \"71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dmvm"
Apr 22 18:37:53.527268 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.524547 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/940a66b8-963f-4a92-adee-fd47c60355d9-konnectivity-ca\") pod \"konnectivity-agent-gjdjf\" (UID: \"940a66b8-963f-4a92-adee-fd47c60355d9\") " pod="kube-system/konnectivity-agent-gjdjf"
Apr 22 18:37:53.527268 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.524594 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f748f72b-69ca-4686-b516-08b6a1e0e7a1-system-cni-dir\") pod \"multus-additional-cni-plugins-772rv\" (UID: \"f748f72b-69ca-4686-b516-08b6a1e0e7a1\") " pod="openshift-multus/multus-additional-cni-plugins-772rv"
Apr 22 18:37:53.527268 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.524605 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da-etc-selinux\") pod \"aws-ebs-csi-driver-node-4dmvm\" (UID: \"71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dmvm"
Apr 22 18:37:53.527268 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.524621 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-cnibin\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh"
Apr 22 18:37:53.527268 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.524648 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g8ph5\" (UniqueName: \"kubernetes.io/projected/9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f-kube-api-access-g8ph5\") pod \"network-metrics-daemon-bkhdr\" (UID: \"9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f\") " pod="openshift-multus/network-metrics-daemon-bkhdr"
Apr 22 18:37:53.527268 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.524673 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/14a4227e-0dba-42ae-a250-c23eb012b836-ovnkube-config\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9"
Apr 22 18:37:53.527268 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.524697 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/14a4227e-0dba-42ae-a250-c23eb012b836-ovn-node-metrics-cert\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9"
Apr 22 18:37:53.527268 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.524721 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-hostroot\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh"
Apr 22 18:37:53.527268 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.524747 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da-socket-dir\") pod \"aws-ebs-csi-driver-node-4dmvm\" (UID: \"71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dmvm"
Apr 22 18:37:53.527268 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.524834 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-cnibin\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh"
Apr 22 18:37:53.527268 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.524877 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/96f11a65-b703-4a16-911b-e932a7067c05-hostroot\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh"
Apr 22 18:37:53.527268 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.525016 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f748f72b-69ca-4686-b516-08b6a1e0e7a1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-772rv\" (UID: \"f748f72b-69ca-4686-b516-08b6a1e0e7a1\") " pod="openshift-multus/multus-additional-cni-plugins-772rv"
Apr 22 18:37:53.527268 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.525184 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/96f11a65-b703-4a16-911b-e932a7067c05-cni-binary-copy\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh"
Apr 22 18:37:53.527268 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.525266 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f748f72b-69ca-4686-b516-08b6a1e0e7a1-system-cni-dir\") pod \"multus-additional-cni-plugins-772rv\" (UID: \"f748f72b-69ca-4686-b516-08b6a1e0e7a1\") " pod="openshift-multus/multus-additional-cni-plugins-772rv"
Apr 22 18:37:53.527268 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.525381 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/940a66b8-963f-4a92-adee-fd47c60355d9-konnectivity-ca\") pod \"konnectivity-agent-gjdjf\" (UID: \"940a66b8-963f-4a92-adee-fd47c60355d9\") " pod="kube-system/konnectivity-agent-gjdjf"
Apr 22 18:37:53.527268 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.525525 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/14a4227e-0dba-42ae-a250-c23eb012b836-ovnkube-config\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9"
Apr 22 18:37:53.527935 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.526024 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-etc-tuned\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px"
Apr 22 18:37:53.527935 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.526188 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-tmp\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px"
Apr 22 18:37:53.527935 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.527557 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/14a4227e-0dba-42ae-a250-c23eb012b836-ovn-node-metrics-cert\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9"
Apr 22 18:37:53.527935 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.527596 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/940a66b8-963f-4a92-adee-fd47c60355d9-agent-certs\") pod \"konnectivity-agent-gjdjf\" (UID: \"940a66b8-963f-4a92-adee-fd47c60355d9\") " pod="kube-system/konnectivity-agent-gjdjf"
Apr 22 18:37:53.532314 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:53.532075 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:37:53.532314 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:53.532096 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:37:53.532314 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:53.532110 2578 projected.go:194] Error preparing data for projected volume kube-api-access-zq7nz for pod openshift-network-diagnostics/network-check-target-fqfx5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:37:53.532314 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:53.532175 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/33339752-7b6a-4821-a16d-b079144b080d-kube-api-access-zq7nz podName:33339752-7b6a-4821-a16d-b079144b080d nodeName:}" failed. No retries permitted until 2026-04-22 18:37:54.032156237 +0000 UTC m=+3.022422867 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-zq7nz" (UniqueName: "kubernetes.io/projected/33339752-7b6a-4821-a16d-b079144b080d-kube-api-access-zq7nz") pod "network-check-target-fqfx5" (UID: "33339752-7b6a-4821-a16d-b079144b080d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:37:53.532614 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.532338 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdfzw\" (UniqueName: \"kubernetes.io/projected/f748f72b-69ca-4686-b516-08b6a1e0e7a1-kube-api-access-fdfzw\") pod \"multus-additional-cni-plugins-772rv\" (UID: \"f748f72b-69ca-4686-b516-08b6a1e0e7a1\") " pod="openshift-multus/multus-additional-cni-plugins-772rv"
Apr 22 18:37:53.533521 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.533289 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8h7t\" (UniqueName: \"kubernetes.io/projected/71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da-kube-api-access-w8h7t\") pod \"aws-ebs-csi-driver-node-4dmvm\" (UID: \"71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dmvm"
Apr 22 18:37:53.534758 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.534443 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph47r\" (UniqueName: \"kubernetes.io/projected/8e2442ec-04ff-4f4c-939c-bc2943aa01cf-kube-api-access-ph47r\") pod \"tuned-sw4px\" (UID: \"8e2442ec-04ff-4f4c-939c-bc2943aa01cf\") " pod="openshift-cluster-node-tuning-operator/tuned-sw4px"
Apr 22 18:37:53.534758 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.534688 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-47j2n\" (UniqueName: \"kubernetes.io/projected/14a4227e-0dba-42ae-a250-c23eb012b836-kube-api-access-47j2n\") pod \"ovnkube-node-f2rx9\" (UID: \"14a4227e-0dba-42ae-a250-c23eb012b836\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9"
Apr 22 18:37:53.534921 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.534837 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sszrj\" (UniqueName: \"kubernetes.io/projected/5cac15d4-0328-41c3-8bbc-e6d020fb09d2-kube-api-access-sszrj\") pod \"node-ca-8vfmp\" (UID: \"5cac15d4-0328-41c3-8bbc-e6d020fb09d2\") " pod="openshift-image-registry/node-ca-8vfmp"
Apr 22 18:37:53.534980 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.534940 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7phk4\" (UniqueName: \"kubernetes.io/projected/b0d1706a-8c90-4a85-abfe-07cf53827364-kube-api-access-7phk4\") pod \"node-resolver-htng9\" (UID: \"b0d1706a-8c90-4a85-abfe-07cf53827364\") " pod="openshift-dns/node-resolver-htng9"
Apr 22 18:37:53.535657 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.535627 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8ph5\" (UniqueName: \"kubernetes.io/projected/9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f-kube-api-access-g8ph5\") pod \"network-metrics-daemon-bkhdr\" (UID: \"9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f\") " pod="openshift-multus/network-metrics-daemon-bkhdr"
Apr 22 18:37:53.536440 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.536417 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrs4f\" (UniqueName: \"kubernetes.io/projected/96f11a65-b703-4a16-911b-e932a7067c05-kube-api-access-qrs4f\") pod \"multus-kh2qh\" (UID: \"96f11a65-b703-4a16-911b-e932a7067c05\") " pod="openshift-multus/multus-kh2qh"
Apr 22 18:37:53.539885 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.539845 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-151.ec2.internal" event={"ID":"0786cfcb4c5f79e61aa898a7fed986dc","Type":"ContainerStarted","Data":"59bdf175c9f35769edb008f063787f4ef7c95c79d24e4b4696aa790110006940"}
Apr 22 18:37:53.541013 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.540994 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-151.ec2.internal" event={"ID":"43f86013446addac1ce20d0896a63cc3","Type":"ContainerStarted","Data":"a8520e7ae6afff574f7c6646a47270b8a9a3356c329b2c49946c241740615a9c"}
Apr 22 18:37:53.577950 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.577886 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:37:53.626001 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.625968 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5zm7\" (UniqueName: \"kubernetes.io/projected/b331f320-606f-47f7-835d-668250b816bc-kube-api-access-m5zm7\") pod \"iptables-alerter-c2nqs\" (UID: \"b331f320-606f-47f7-835d-668250b816bc\") " pod="openshift-network-operator/iptables-alerter-c2nqs"
Apr 22 18:37:53.626142 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.626031 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b331f320-606f-47f7-835d-668250b816bc-iptables-alerter-script\") pod \"iptables-alerter-c2nqs\" (UID: \"b331f320-606f-47f7-835d-668250b816bc\") " pod="openshift-network-operator/iptables-alerter-c2nqs"
Apr 22 18:37:53.626142 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.626100 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b331f320-606f-47f7-835d-668250b816bc-host-slash\") pod \"iptables-alerter-c2nqs\" (UID: \"b331f320-606f-47f7-835d-668250b816bc\") " pod="openshift-network-operator/iptables-alerter-c2nqs"
Apr 22 18:37:53.626250 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.626169 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b331f320-606f-47f7-835d-668250b816bc-host-slash\") pod \"iptables-alerter-c2nqs\" (UID: \"b331f320-606f-47f7-835d-668250b816bc\") " pod="openshift-network-operator/iptables-alerter-c2nqs"
Apr 22 18:37:53.626586 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.626564 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b331f320-606f-47f7-835d-668250b816bc-iptables-alerter-script\") pod \"iptables-alerter-c2nqs\" (UID: \"b331f320-606f-47f7-835d-668250b816bc\") " pod="openshift-network-operator/iptables-alerter-c2nqs"
Apr 22 18:37:53.634889 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.634868 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5zm7\" (UniqueName: \"kubernetes.io/projected/b331f320-606f-47f7-835d-668250b816bc-kube-api-access-m5zm7\") pod \"iptables-alerter-c2nqs\" (UID: \"b331f320-606f-47f7-835d-668250b816bc\") " pod="openshift-network-operator/iptables-alerter-c2nqs"
Apr 22 18:37:53.712029 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.711994 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dmvm"
Apr 22 18:37:53.719489 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.719447 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-sw4px"
Apr 22 18:37:53.729276 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.729251 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-8vfmp"
Apr 22 18:37:53.734844 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.734817 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-htng9"
Apr 22 18:37:53.742409 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.742389 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-gjdjf"
Apr 22 18:37:53.748915 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.748897 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-772rv"
Apr 22 18:37:53.756502 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.756477 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9"
Apr 22 18:37:53.764139 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.764112 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kh2qh"
Apr 22 18:37:53.770738 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:53.770715 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-c2nqs"
Apr 22 18:37:54.029499 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:54.029421 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f-metrics-certs\") pod \"network-metrics-daemon-bkhdr\" (UID: \"9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f\") " pod="openshift-multus/network-metrics-daemon-bkhdr"
Apr 22 18:37:54.029626 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:54.029534 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:37:54.029626 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:54.029587 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f-metrics-certs podName:9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f nodeName:}" failed. No retries permitted until 2026-04-22 18:37:55.029571675 +0000 UTC m=+4.019838282 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f-metrics-certs") pod "network-metrics-daemon-bkhdr" (UID: "9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:37:54.130581 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:54.130546 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zq7nz\" (UniqueName: \"kubernetes.io/projected/33339752-7b6a-4821-a16d-b079144b080d-kube-api-access-zq7nz\") pod \"network-check-target-fqfx5\" (UID: \"33339752-7b6a-4821-a16d-b079144b080d\") " pod="openshift-network-diagnostics/network-check-target-fqfx5"
Apr 22 18:37:54.130768 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:54.130735 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:37:54.130768 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:54.130755 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:37:54.130768 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:54.130767 2578 projected.go:194] Error preparing data for projected volume kube-api-access-zq7nz for pod openshift-network-diagnostics/network-check-target-fqfx5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:37:54.130922 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:54.130824 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/33339752-7b6a-4821-a16d-b079144b080d-kube-api-access-zq7nz podName:33339752-7b6a-4821-a16d-b079144b080d nodeName:}" failed. No retries permitted until 2026-04-22 18:37:55.130807813 +0000 UTC m=+4.121074425 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-zq7nz" (UniqueName: "kubernetes.io/projected/33339752-7b6a-4821-a16d-b079144b080d-kube-api-access-zq7nz") pod "network-check-target-fqfx5" (UID: "33339752-7b6a-4821-a16d-b079144b080d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:37:54.239646 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:54.239619 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e2442ec_04ff_4f4c_939c_bc2943aa01cf.slice/crio-af35f02ad74178c444c6d3e62fa0ac1fd9ae62bf2b48b32835842a4de25a3219 WatchSource:0}: Error finding container af35f02ad74178c444c6d3e62fa0ac1fd9ae62bf2b48b32835842a4de25a3219: Status 404 returned error can't find the container with id af35f02ad74178c444c6d3e62fa0ac1fd9ae62bf2b48b32835842a4de25a3219
Apr 22 18:37:54.257433 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:54.257406 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71f5fcb3_52cb_4b9a_acbe_8b62fc2d17da.slice/crio-43f7f2d2149dc2c05236440a113840ad37914d2cb325c32a29a4f41fc21b4c82 WatchSource:0}: Error finding container 43f7f2d2149dc2c05236440a113840ad37914d2cb325c32a29a4f41fc21b4c82: Status 404 returned error can't find the container with id 43f7f2d2149dc2c05236440a113840ad37914d2cb325c32a29a4f41fc21b4c82
Apr 22 18:37:54.258156 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:54.258129 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb331f320_606f_47f7_835d_668250b816bc.slice/crio-115a4cebdedcb06dd2dfe0d48fd1c68246f1b26e598b954ef72973c5da108eba WatchSource:0}: Error finding container 115a4cebdedcb06dd2dfe0d48fd1c68246f1b26e598b954ef72973c5da108eba: Status 404 returned error can't find the container with id 115a4cebdedcb06dd2dfe0d48fd1c68246f1b26e598b954ef72973c5da108eba
Apr 22 18:37:54.258987 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:54.258871 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cac15d4_0328_41c3_8bbc_e6d020fb09d2.slice/crio-1b4d58181d39152db07bcaec06100f47532a2b5018a44942e0a7ff1cd8604aa9 WatchSource:0}: Error finding container 1b4d58181d39152db07bcaec06100f47532a2b5018a44942e0a7ff1cd8604aa9: Status 404 returned error can't find the container with id 1b4d58181d39152db07bcaec06100f47532a2b5018a44942e0a7ff1cd8604aa9
Apr 22 18:37:54.261048 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:54.260659 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod940a66b8_963f_4a92_adee_fd47c60355d9.slice/crio-10a3224181b7249ebb147453a0dc6e6c0b63a3ddcc5b13a39d6212a5498fff03 WatchSource:0}: Error finding container 10a3224181b7249ebb147453a0dc6e6c0b63a3ddcc5b13a39d6212a5498fff03: Status 404 returned error can't find the container with id 10a3224181b7249ebb147453a0dc6e6c0b63a3ddcc5b13a39d6212a5498fff03
Apr 22 18:37:54.262554 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:54.262388 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96f11a65_b703_4a16_911b_e932a7067c05.slice/crio-4784d2f70929d3ea4f1ae0cb734dbb9d82cc2cb5728b5748ac0f501624264e96 WatchSource:0}: Error finding container 4784d2f70929d3ea4f1ae0cb734dbb9d82cc2cb5728b5748ac0f501624264e96: Status 404 returned error can't find the container with id 4784d2f70929d3ea4f1ae0cb734dbb9d82cc2cb5728b5748ac0f501624264e96
Apr 22 18:37:54.263065 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:54.262968 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14a4227e_0dba_42ae_a250_c23eb012b836.slice/crio-40779841e1d7e2c1aad472f732167666dba5fd5995ece6392f3278c2d27884df WatchSource:0}: Error finding container 40779841e1d7e2c1aad472f732167666dba5fd5995ece6392f3278c2d27884df: Status 404 returned error can't find the container with id 40779841e1d7e2c1aad472f732167666dba5fd5995ece6392f3278c2d27884df
Apr 22 18:37:54.264314 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:54.264290 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0d1706a_8c90_4a85_abfe_07cf53827364.slice/crio-59a2f322a16d305f2f07f11019772e31d5d26c05d39279663cd6e1f97d890a7a WatchSource:0}: Error finding container 59a2f322a16d305f2f07f11019772e31d5d26c05d39279663cd6e1f97d890a7a: Status 404 returned error can't find the container with id 59a2f322a16d305f2f07f11019772e31d5d26c05d39279663cd6e1f97d890a7a
Apr 22 18:37:54.264954 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:37:54.264936 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf748f72b_69ca_4686_b516_08b6a1e0e7a1.slice/crio-25209e40e59e7ea73f9b569063a16d8e99e1bd2df27bb13e12344fa34920f64e WatchSource:0}: Error finding container 25209e40e59e7ea73f9b569063a16d8e99e1bd2df27bb13e12344fa34920f64e: Status 404 returned error can't find the container with id 25209e40e59e7ea73f9b569063a16d8e99e1bd2df27bb13e12344fa34920f64e
Apr 22 18:37:54.442883 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:54.442847 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:32:52 +0000 UTC" deadline="2027-12-09 03:09:22.593669104 +0000 UTC"
Apr 22 18:37:54.442883 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:54.442877 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14288h31m28.150794756s"
Apr 22 18:37:54.543349 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:54.543272 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-772rv" event={"ID":"f748f72b-69ca-4686-b516-08b6a1e0e7a1","Type":"ContainerStarted","Data":"25209e40e59e7ea73f9b569063a16d8e99e1bd2df27bb13e12344fa34920f64e"}
Apr 22 18:37:54.544141 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:54.544114 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-htng9" event={"ID":"b0d1706a-8c90-4a85-abfe-07cf53827364","Type":"ContainerStarted","Data":"59a2f322a16d305f2f07f11019772e31d5d26c05d39279663cd6e1f97d890a7a"}
Apr 22 18:37:54.545594 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:54.545561 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" event={"ID":"14a4227e-0dba-42ae-a250-c23eb012b836","Type":"ContainerStarted","Data":"40779841e1d7e2c1aad472f732167666dba5fd5995ece6392f3278c2d27884df"}
Apr 22 18:37:54.546504 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:54.546479 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-gjdjf" event={"ID":"940a66b8-963f-4a92-adee-fd47c60355d9","Type":"ContainerStarted","Data":"10a3224181b7249ebb147453a0dc6e6c0b63a3ddcc5b13a39d6212a5498fff03"}
Apr 22 18:37:54.547416 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:54.547395 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-c2nqs" event={"ID":"b331f320-606f-47f7-835d-668250b816bc","Type":"ContainerStarted","Data":"115a4cebdedcb06dd2dfe0d48fd1c68246f1b26e598b954ef72973c5da108eba"}
Apr 22 18:37:54.548289 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:54.548269 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-sw4px" event={"ID":"8e2442ec-04ff-4f4c-939c-bc2943aa01cf","Type":"ContainerStarted","Data":"af35f02ad74178c444c6d3e62fa0ac1fd9ae62bf2b48b32835842a4de25a3219"}
Apr 22 18:37:54.549702 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:54.549672 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-151.ec2.internal" event={"ID":"0786cfcb4c5f79e61aa898a7fed986dc","Type":"ContainerStarted","Data":"0da2f4c9e87717d4dee357d73af26b6ff4b57790cb8ef09b003fdc0468a26a2a"}
Apr 22 18:37:54.550646 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:54.550618 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kh2qh" event={"ID":"96f11a65-b703-4a16-911b-e932a7067c05","Type":"ContainerStarted","Data":"4784d2f70929d3ea4f1ae0cb734dbb9d82cc2cb5728b5748ac0f501624264e96"}
Apr 22 18:37:54.551500 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:54.551477 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8vfmp" event={"ID":"5cac15d4-0328-41c3-8bbc-e6d020fb09d2","Type":"ContainerStarted","Data":"1b4d58181d39152db07bcaec06100f47532a2b5018a44942e0a7ff1cd8604aa9"}
Apr 22 18:37:54.552276 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:54.552256 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dmvm" event={"ID":"71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da","Type":"ContainerStarted","Data":"43f7f2d2149dc2c05236440a113840ad37914d2cb325c32a29a4f41fc21b4c82"}
Apr 22 18:37:54.566164 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:54.566125 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-151.ec2.internal" podStartSLOduration=1.56611307 podStartE2EDuration="1.56611307s" podCreationTimestamp="2026-04-22 18:37:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:37:54.565812811 +0000 UTC m=+3.556079440" watchObservedRunningTime="2026-04-22 18:37:54.56611307 +0000 UTC m=+3.556379699"
Apr 22 18:37:54.779340 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:54.779045 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:37:55.037446 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:55.037333 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f-metrics-certs\") pod \"network-metrics-daemon-bkhdr\" (UID: \"9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f\") " pod="openshift-multus/network-metrics-daemon-bkhdr"
Apr 22 18:37:55.037615 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:55.037542 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:37:55.037615 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:55.037610 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f-metrics-certs podName:9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f nodeName:}" failed. No retries permitted until 2026-04-22 18:37:57.03759009 +0000 UTC m=+6.027856778 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f-metrics-certs") pod "network-metrics-daemon-bkhdr" (UID: "9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:37:55.138178 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:55.138144 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zq7nz\" (UniqueName: \"kubernetes.io/projected/33339752-7b6a-4821-a16d-b079144b080d-kube-api-access-zq7nz\") pod \"network-check-target-fqfx5\" (UID: \"33339752-7b6a-4821-a16d-b079144b080d\") " pod="openshift-network-diagnostics/network-check-target-fqfx5"
Apr 22 18:37:55.138337 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:55.138289 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:37:55.138337 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:55.138303 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:37:55.138337 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:55.138312 2578 projected.go:194] Error preparing data for projected volume kube-api-access-zq7nz for pod openshift-network-diagnostics/network-check-target-fqfx5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:37:55.138518 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:55.138356 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/33339752-7b6a-4821-a16d-b079144b080d-kube-api-access-zq7nz podName:33339752-7b6a-4821-a16d-b079144b080d nodeName:}" failed.
No retries permitted until 2026-04-22 18:37:57.138342507 +0000 UTC m=+6.128609114 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-zq7nz" (UniqueName: "kubernetes.io/projected/33339752-7b6a-4821-a16d-b079144b080d-kube-api-access-zq7nz") pod "network-check-target-fqfx5" (UID: "33339752-7b6a-4821-a16d-b079144b080d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:37:55.535483 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:55.535428 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkhdr" Apr 22 18:37:55.535950 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:55.535591 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bkhdr" podUID="9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f" Apr 22 18:37:55.536098 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:55.536070 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fqfx5" Apr 22 18:37:55.536202 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:55.536164 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-fqfx5" podUID="33339752-7b6a-4821-a16d-b079144b080d" Apr 22 18:37:55.565175 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:55.565130 2578 generic.go:358] "Generic (PLEG): container finished" podID="43f86013446addac1ce20d0896a63cc3" containerID="72a485eddc3a164da7df6a847b5dd3017436a434cb9fd008341b5ad00e399734" exitCode=0 Apr 22 18:37:55.565342 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:55.565216 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-151.ec2.internal" event={"ID":"43f86013446addac1ce20d0896a63cc3","Type":"ContainerDied","Data":"72a485eddc3a164da7df6a847b5dd3017436a434cb9fd008341b5ad00e399734"} Apr 22 18:37:56.588496 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:56.587797 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-151.ec2.internal" event={"ID":"43f86013446addac1ce20d0896a63cc3","Type":"ContainerStarted","Data":"c548ff2fe34832194b94664a8f6b3c1bd39c83d65cb4ef203d04ee77bb094965"} Apr 22 18:37:56.606867 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:56.606811 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-151.ec2.internal" podStartSLOduration=3.6067949500000003 podStartE2EDuration="3.60679495s" podCreationTimestamp="2026-04-22 18:37:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:37:56.606023 +0000 UTC m=+5.596289631" watchObservedRunningTime="2026-04-22 18:37:56.60679495 +0000 UTC m=+5.597061574" Apr 22 18:37:57.058417 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:57.057774 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f-metrics-certs\") pod \"network-metrics-daemon-bkhdr\" (UID: \"9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f\") " pod="openshift-multus/network-metrics-daemon-bkhdr" Apr 22 18:37:57.058417 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:57.057942 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:37:57.058417 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:57.058006 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f-metrics-certs podName:9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f nodeName:}" failed. No retries permitted until 2026-04-22 18:38:01.057986873 +0000 UTC m=+10.048253481 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f-metrics-certs") pod "network-metrics-daemon-bkhdr" (UID: "9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:37:57.159661 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:57.159024 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zq7nz\" (UniqueName: \"kubernetes.io/projected/33339752-7b6a-4821-a16d-b079144b080d-kube-api-access-zq7nz\") pod \"network-check-target-fqfx5\" (UID: \"33339752-7b6a-4821-a16d-b079144b080d\") " pod="openshift-network-diagnostics/network-check-target-fqfx5" Apr 22 18:37:57.159661 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:57.159223 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:37:57.159661 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:57.159241 2578 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:37:57.159661 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:57.159254 2578 projected.go:194] Error preparing data for projected volume kube-api-access-zq7nz for pod openshift-network-diagnostics/network-check-target-fqfx5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:37:57.159661 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:57.159307 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/33339752-7b6a-4821-a16d-b079144b080d-kube-api-access-zq7nz podName:33339752-7b6a-4821-a16d-b079144b080d nodeName:}" failed. No retries permitted until 2026-04-22 18:38:01.159289771 +0000 UTC m=+10.149556385 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-zq7nz" (UniqueName: "kubernetes.io/projected/33339752-7b6a-4821-a16d-b079144b080d-kube-api-access-zq7nz") pod "network-check-target-fqfx5" (UID: "33339752-7b6a-4821-a16d-b079144b080d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:37:57.534762 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:57.534675 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkhdr" Apr 22 18:37:57.534922 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:57.534854 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bkhdr" podUID="9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f" Apr 22 18:37:57.534984 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:57.534956 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fqfx5" Apr 22 18:37:57.535091 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:57.535068 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fqfx5" podUID="33339752-7b6a-4821-a16d-b079144b080d" Apr 22 18:37:59.535763 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:59.535254 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fqfx5" Apr 22 18:37:59.535763 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:37:59.535254 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkhdr" Apr 22 18:37:59.535763 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:59.535388 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-fqfx5" podUID="33339752-7b6a-4821-a16d-b079144b080d" Apr 22 18:37:59.535763 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:37:59.535517 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bkhdr" podUID="9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f" Apr 22 18:38:01.091627 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:01.091583 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f-metrics-certs\") pod \"network-metrics-daemon-bkhdr\" (UID: \"9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f\") " pod="openshift-multus/network-metrics-daemon-bkhdr" Apr 22 18:38:01.092105 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:01.091782 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:38:01.092105 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:01.091854 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f-metrics-certs podName:9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f nodeName:}" failed. No retries permitted until 2026-04-22 18:38:09.091833949 +0000 UTC m=+18.082100561 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f-metrics-certs") pod "network-metrics-daemon-bkhdr" (UID: "9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:38:01.192840 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:01.192192 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zq7nz\" (UniqueName: \"kubernetes.io/projected/33339752-7b6a-4821-a16d-b079144b080d-kube-api-access-zq7nz\") pod \"network-check-target-fqfx5\" (UID: \"33339752-7b6a-4821-a16d-b079144b080d\") " pod="openshift-network-diagnostics/network-check-target-fqfx5" Apr 22 18:38:01.192840 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:01.192386 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:38:01.192840 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:01.192408 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:38:01.192840 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:01.192441 2578 projected.go:194] Error preparing data for projected volume kube-api-access-zq7nz for pod openshift-network-diagnostics/network-check-target-fqfx5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:38:01.192840 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:01.192529 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/33339752-7b6a-4821-a16d-b079144b080d-kube-api-access-zq7nz podName:33339752-7b6a-4821-a16d-b079144b080d nodeName:}" failed. 
No retries permitted until 2026-04-22 18:38:09.192508842 +0000 UTC m=+18.182775454 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-zq7nz" (UniqueName: "kubernetes.io/projected/33339752-7b6a-4821-a16d-b079144b080d-kube-api-access-zq7nz") pod "network-check-target-fqfx5" (UID: "33339752-7b6a-4821-a16d-b079144b080d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:38:01.535966 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:01.535505 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fqfx5" Apr 22 18:38:01.535966 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:01.535613 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fqfx5" podUID="33339752-7b6a-4821-a16d-b079144b080d" Apr 22 18:38:01.535966 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:01.535791 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkhdr" Apr 22 18:38:01.535966 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:01.535887 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bkhdr" podUID="9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f" Apr 22 18:38:03.535071 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:03.534964 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fqfx5" Apr 22 18:38:03.535071 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:03.534992 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkhdr" Apr 22 18:38:03.535589 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:03.535098 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fqfx5" podUID="33339752-7b6a-4821-a16d-b079144b080d" Apr 22 18:38:03.535589 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:03.535180 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bkhdr" podUID="9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f" Apr 22 18:38:05.534570 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:05.534533 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fqfx5" Apr 22 18:38:05.534989 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:05.534672 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fqfx5" podUID="33339752-7b6a-4821-a16d-b079144b080d" Apr 22 18:38:05.534989 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:05.534735 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkhdr" Apr 22 18:38:05.534989 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:05.534862 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bkhdr" podUID="9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f" Apr 22 18:38:07.535383 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:07.535335 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkhdr" Apr 22 18:38:07.535383 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:07.535382 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fqfx5" Apr 22 18:38:07.535935 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:07.535496 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bkhdr" podUID="9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f" Apr 22 18:38:07.535935 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:07.535629 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fqfx5" podUID="33339752-7b6a-4821-a16d-b079144b080d" Apr 22 18:38:09.152228 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:09.152182 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f-metrics-certs\") pod \"network-metrics-daemon-bkhdr\" (UID: \"9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f\") " pod="openshift-multus/network-metrics-daemon-bkhdr" Apr 22 18:38:09.152711 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:09.152323 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:38:09.152711 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:09.152408 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f-metrics-certs podName:9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f nodeName:}" failed. 
No retries permitted until 2026-04-22 18:38:25.152387447 +0000 UTC m=+34.142654073 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f-metrics-certs") pod "network-metrics-daemon-bkhdr" (UID: "9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:38:09.253296 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:09.253242 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zq7nz\" (UniqueName: \"kubernetes.io/projected/33339752-7b6a-4821-a16d-b079144b080d-kube-api-access-zq7nz\") pod \"network-check-target-fqfx5\" (UID: \"33339752-7b6a-4821-a16d-b079144b080d\") " pod="openshift-network-diagnostics/network-check-target-fqfx5" Apr 22 18:38:09.253484 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:09.253403 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:38:09.253484 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:09.253430 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:38:09.253484 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:09.253442 2578 projected.go:194] Error preparing data for projected volume kube-api-access-zq7nz for pod openshift-network-diagnostics/network-check-target-fqfx5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:38:09.253621 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:09.253508 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/33339752-7b6a-4821-a16d-b079144b080d-kube-api-access-zq7nz 
podName:33339752-7b6a-4821-a16d-b079144b080d nodeName:}" failed. No retries permitted until 2026-04-22 18:38:25.253492954 +0000 UTC m=+34.243759561 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-zq7nz" (UniqueName: "kubernetes.io/projected/33339752-7b6a-4821-a16d-b079144b080d-kube-api-access-zq7nz") pod "network-check-target-fqfx5" (UID: "33339752-7b6a-4821-a16d-b079144b080d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:38:09.535058 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:09.534974 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkhdr" Apr 22 18:38:09.535222 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:09.534974 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fqfx5" Apr 22 18:38:09.535222 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:09.535102 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bkhdr" podUID="9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f" Apr 22 18:38:09.535222 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:09.535175 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-fqfx5" podUID="33339752-7b6a-4821-a16d-b079144b080d" Apr 22 18:38:11.535252 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:11.535094 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fqfx5" Apr 22 18:38:11.535977 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:11.535312 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fqfx5" podUID="33339752-7b6a-4821-a16d-b079144b080d" Apr 22 18:38:11.535977 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:11.535203 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkhdr" Apr 22 18:38:11.535977 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:11.535412 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bkhdr" podUID="9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f" Apr 22 18:38:11.612087 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:11.612053 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-sw4px" event={"ID":"8e2442ec-04ff-4f4c-939c-bc2943aa01cf","Type":"ContainerStarted","Data":"5bef7096966287489af73978e863b94188971600c48726172f4a1b4a9474e151"} Apr 22 18:38:11.613184 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:11.613163 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kh2qh" event={"ID":"96f11a65-b703-4a16-911b-e932a7067c05","Type":"ContainerStarted","Data":"7157b395c202bf039dfdf98513fafb3c2c7593173021780fa92767ffaec20809"} Apr 22 18:38:11.614417 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:11.614395 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8vfmp" event={"ID":"5cac15d4-0328-41c3-8bbc-e6d020fb09d2","Type":"ContainerStarted","Data":"4841874828a1af4435100f298741119d057f69873296845e5a53c2a845f09cf0"} Apr 22 18:38:11.615553 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:11.615534 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dmvm" event={"ID":"71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da","Type":"ContainerStarted","Data":"f123043ac6a3240d15bfa76c43f6f1b3a1190282cfcdd0db184a1b9b32621dd6"} Apr 22 18:38:11.616978 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:11.616960 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-772rv" event={"ID":"f748f72b-69ca-4686-b516-08b6a1e0e7a1","Type":"ContainerStarted","Data":"3125338eefd020a3ceb4dd1e31759377d8411b8ab82643a7534e2d6848e3c5c7"} Apr 22 18:38:11.618099 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:11.618078 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-htng9" 
event={"ID":"b0d1706a-8c90-4a85-abfe-07cf53827364","Type":"ContainerStarted","Data":"c61ee783a675aee1d12099ffa488912f4b62b331c6b4f55f566dd83509cb0e66"} Apr 22 18:38:11.619297 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:11.619270 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" event={"ID":"14a4227e-0dba-42ae-a250-c23eb012b836","Type":"ContainerStarted","Data":"435d372dd7691ccdd303a022aa9c840ebd633af51a06efe448b976528de1f84e"} Apr 22 18:38:11.620427 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:11.620408 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-gjdjf" event={"ID":"940a66b8-963f-4a92-adee-fd47c60355d9","Type":"ContainerStarted","Data":"fe5154514ac6ca5d65953d36ea352f88e21d4385b0e3cbed563dcd15cc3b7d58"} Apr 22 18:38:11.631644 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:11.631604 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-sw4px" podStartSLOduration=3.652186599 podStartE2EDuration="20.631592599s" podCreationTimestamp="2026-04-22 18:37:51 +0000 UTC" firstStartedPulling="2026-04-22 18:37:54.25007775 +0000 UTC m=+3.240344358" lastFinishedPulling="2026-04-22 18:38:11.229483745 +0000 UTC m=+20.219750358" observedRunningTime="2026-04-22 18:38:11.631097827 +0000 UTC m=+20.621364456" watchObservedRunningTime="2026-04-22 18:38:11.631592599 +0000 UTC m=+20.621859224" Apr 22 18:38:11.687880 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:11.687839 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-kh2qh" podStartSLOduration=3.6771990519999997 podStartE2EDuration="20.687823941s" podCreationTimestamp="2026-04-22 18:37:51 +0000 UTC" firstStartedPulling="2026-04-22 18:37:54.264485259 +0000 UTC m=+3.254751876" lastFinishedPulling="2026-04-22 18:38:11.275110157 +0000 UTC m=+20.265376765" observedRunningTime="2026-04-22 18:38:11.671253773 
+0000 UTC m=+20.661520403" watchObservedRunningTime="2026-04-22 18:38:11.687823941 +0000 UTC m=+20.678090569" Apr 22 18:38:11.688149 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:11.688130 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-8vfmp" podStartSLOduration=8.309198746 podStartE2EDuration="20.688124209s" podCreationTimestamp="2026-04-22 18:37:51 +0000 UTC" firstStartedPulling="2026-04-22 18:37:54.263476867 +0000 UTC m=+3.253743474" lastFinishedPulling="2026-04-22 18:38:06.642402327 +0000 UTC m=+15.632668937" observedRunningTime="2026-04-22 18:38:11.687579223 +0000 UTC m=+20.677845852" watchObservedRunningTime="2026-04-22 18:38:11.688124209 +0000 UTC m=+20.678390839" Apr 22 18:38:11.702734 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:11.702489 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-htng9" podStartSLOduration=3.7413779959999998 podStartE2EDuration="20.702442852s" podCreationTimestamp="2026-04-22 18:37:51 +0000 UTC" firstStartedPulling="2026-04-22 18:37:54.268085007 +0000 UTC m=+3.258351618" lastFinishedPulling="2026-04-22 18:38:11.229149861 +0000 UTC m=+20.219416474" observedRunningTime="2026-04-22 18:38:11.701551266 +0000 UTC m=+20.691817895" watchObservedRunningTime="2026-04-22 18:38:11.702442852 +0000 UTC m=+20.692709483" Apr 22 18:38:11.716264 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:11.716217 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-gjdjf" podStartSLOduration=3.75058193 podStartE2EDuration="20.716201921s" podCreationTimestamp="2026-04-22 18:37:51 +0000 UTC" firstStartedPulling="2026-04-22 18:37:54.263871624 +0000 UTC m=+3.254138244" lastFinishedPulling="2026-04-22 18:38:11.229491629 +0000 UTC m=+20.219758235" observedRunningTime="2026-04-22 18:38:11.71584065 +0000 UTC m=+20.706107279" watchObservedRunningTime="2026-04-22 18:38:11.716201921 +0000 
UTC m=+20.706468550" Apr 22 18:38:12.555399 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:12.555245 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 18:38:12.623573 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:12.623524 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dmvm" event={"ID":"71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da","Type":"ContainerStarted","Data":"bde94bd0eb40076fb0b971346e620851b01d55861907bbcde4a2cbf1ae971cad"} Apr 22 18:38:12.624638 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:12.624611 2578 generic.go:358] "Generic (PLEG): container finished" podID="f748f72b-69ca-4686-b516-08b6a1e0e7a1" containerID="3125338eefd020a3ceb4dd1e31759377d8411b8ab82643a7534e2d6848e3c5c7" exitCode=0 Apr 22 18:38:12.624737 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:12.624700 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-772rv" event={"ID":"f748f72b-69ca-4686-b516-08b6a1e0e7a1","Type":"ContainerDied","Data":"3125338eefd020a3ceb4dd1e31759377d8411b8ab82643a7534e2d6848e3c5c7"} Apr 22 18:38:12.627206 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:12.627187 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2rx9_14a4227e-0dba-42ae-a250-c23eb012b836/ovn-acl-logging/0.log" Apr 22 18:38:12.627539 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:12.627521 2578 generic.go:358] "Generic (PLEG): container finished" podID="14a4227e-0dba-42ae-a250-c23eb012b836" containerID="e2f07cf7e92fb8e720e1624dcedb1b3ff919835f3a280543a2d2b007e70bc7f9" exitCode=1 Apr 22 18:38:12.627667 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:12.627625 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" 
event={"ID":"14a4227e-0dba-42ae-a250-c23eb012b836","Type":"ContainerStarted","Data":"43f195333c34a42d6b33e7dbb9f270f23676c105f927b7372dc9c5326a2a3682"} Apr 22 18:38:12.627667 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:12.627662 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" event={"ID":"14a4227e-0dba-42ae-a250-c23eb012b836","Type":"ContainerStarted","Data":"d6c1b93c5a62b78f1117ffeae8864bc6cfbbd309e15d8e469390856d56588348"} Apr 22 18:38:12.627744 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:12.627679 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" event={"ID":"14a4227e-0dba-42ae-a250-c23eb012b836","Type":"ContainerStarted","Data":"fef6c19abd3d2970f6e26be8648e91f79ec8d0816fe5e5e5ca89dca8e020d76b"} Apr 22 18:38:12.627744 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:12.627694 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" event={"ID":"14a4227e-0dba-42ae-a250-c23eb012b836","Type":"ContainerStarted","Data":"24d37caf440b0de534f14fb985372da0699b63b43065628624b4db566750e9ef"} Apr 22 18:38:12.627744 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:12.627709 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" event={"ID":"14a4227e-0dba-42ae-a250-c23eb012b836","Type":"ContainerDied","Data":"e2f07cf7e92fb8e720e1624dcedb1b3ff919835f3a280543a2d2b007e70bc7f9"} Apr 22 18:38:13.476050 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:13.475951 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T18:38:12.555393369Z","UUID":"6f7e4c6a-454d-4379-a2cb-9e318266294e","Handler":null,"Name":"","Endpoint":""} Apr 22 18:38:13.477769 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:13.477746 2578 csi_plugin.go:106] kubernetes.io/csi: Trying 
to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 18:38:13.477769 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:13.477777 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 18:38:13.534997 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:13.534966 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkhdr" Apr 22 18:38:13.535177 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:13.534967 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fqfx5" Apr 22 18:38:13.535177 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:13.535097 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bkhdr" podUID="9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f" Apr 22 18:38:13.535177 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:13.535154 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-fqfx5" podUID="33339752-7b6a-4821-a16d-b079144b080d" Apr 22 18:38:13.631166 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:13.631129 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-c2nqs" event={"ID":"b331f320-606f-47f7-835d-668250b816bc","Type":"ContainerStarted","Data":"901fb62dda9df353f4ee2d30d638ff83f0967ff1a85bc0946133423426ccab39"} Apr 22 18:38:13.647630 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:13.647580 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-c2nqs" podStartSLOduration=5.67923001 podStartE2EDuration="22.647565305s" podCreationTimestamp="2026-04-22 18:37:51 +0000 UTC" firstStartedPulling="2026-04-22 18:37:54.260820104 +0000 UTC m=+3.251086726" lastFinishedPulling="2026-04-22 18:38:11.229155413 +0000 UTC m=+20.219422021" observedRunningTime="2026-04-22 18:38:13.646958951 +0000 UTC m=+22.637225579" watchObservedRunningTime="2026-04-22 18:38:13.647565305 +0000 UTC m=+22.637831933" Apr 22 18:38:14.635867 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:14.635769 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dmvm" event={"ID":"71f5fcb3-52cb-4b9a-acbe-8b62fc2d17da","Type":"ContainerStarted","Data":"d72bd786a0379f66ef1bb2ca6b86aa2c8af165e952ad78da9f27c463a8e2ec2b"} Apr 22 18:38:14.639197 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:14.639176 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2rx9_14a4227e-0dba-42ae-a250-c23eb012b836/ovn-acl-logging/0.log" Apr 22 18:38:14.639601 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:14.639575 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" 
event={"ID":"14a4227e-0dba-42ae-a250-c23eb012b836","Type":"ContainerStarted","Data":"ecef776aec07e5b8e22c04714cd571f7412c26c904a605c27a5fda31dc7634ca"} Apr 22 18:38:14.655624 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:14.655565 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dmvm" podStartSLOduration=4.198839198 podStartE2EDuration="23.655547248s" podCreationTimestamp="2026-04-22 18:37:51 +0000 UTC" firstStartedPulling="2026-04-22 18:37:54.259366757 +0000 UTC m=+3.249633364" lastFinishedPulling="2026-04-22 18:38:13.716074803 +0000 UTC m=+22.706341414" observedRunningTime="2026-04-22 18:38:14.65551313 +0000 UTC m=+23.645779759" watchObservedRunningTime="2026-04-22 18:38:14.655547248 +0000 UTC m=+23.645813878" Apr 22 18:38:15.534427 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:15.534390 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkhdr" Apr 22 18:38:15.534621 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:15.534395 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fqfx5" Apr 22 18:38:15.534621 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:15.534582 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bkhdr" podUID="9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f" Apr 22 18:38:15.534730 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:15.534672 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fqfx5" podUID="33339752-7b6a-4821-a16d-b079144b080d" Apr 22 18:38:16.546574 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:16.546522 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-gjdjf" Apr 22 18:38:16.547358 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:16.547341 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-gjdjf" Apr 22 18:38:16.644233 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:16.644190 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-gjdjf" Apr 22 18:38:16.644819 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:16.644798 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-gjdjf" Apr 22 18:38:17.534771 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:17.534593 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fqfx5" Apr 22 18:38:17.534939 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:17.534614 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkhdr" Apr 22 18:38:17.534939 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:17.534851 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fqfx5" podUID="33339752-7b6a-4821-a16d-b079144b080d" Apr 22 18:38:17.534939 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:17.534913 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bkhdr" podUID="9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f" Apr 22 18:38:17.647514 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:17.647456 2578 generic.go:358] "Generic (PLEG): container finished" podID="f748f72b-69ca-4686-b516-08b6a1e0e7a1" containerID="9ba923b9b53b2a2fb85117dda4d0232ab44d8cb14d17c0c0206e1b3bfe7c2f13" exitCode=0 Apr 22 18:38:17.647982 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:17.647552 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-772rv" event={"ID":"f748f72b-69ca-4686-b516-08b6a1e0e7a1","Type":"ContainerDied","Data":"9ba923b9b53b2a2fb85117dda4d0232ab44d8cb14d17c0c0206e1b3bfe7c2f13"} Apr 22 18:38:17.650664 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:17.650645 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2rx9_14a4227e-0dba-42ae-a250-c23eb012b836/ovn-acl-logging/0.log" Apr 22 18:38:17.650966 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:17.650939 
2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" event={"ID":"14a4227e-0dba-42ae-a250-c23eb012b836","Type":"ContainerStarted","Data":"20ebc9f81cefcfe8c0ed228b4acd3e5b3713db4dd21e4c6dacfc3b596f707fdb"} Apr 22 18:38:17.651399 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:17.651287 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:38:17.651399 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:17.651371 2578 scope.go:117] "RemoveContainer" containerID="e2f07cf7e92fb8e720e1624dcedb1b3ff919835f3a280543a2d2b007e70bc7f9" Apr 22 18:38:17.666755 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:17.666733 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:38:18.408427 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:18.408387 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:38:18.658061 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:18.658040 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2rx9_14a4227e-0dba-42ae-a250-c23eb012b836/ovn-acl-logging/0.log" Apr 22 18:38:18.658795 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:18.658613 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" event={"ID":"14a4227e-0dba-42ae-a250-c23eb012b836","Type":"ContainerStarted","Data":"8cf04e4c4bae856485afc26b120fffbe2d009ecf3b1c395e77fc890d9ce1c62b"} Apr 22 18:38:18.659104 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:18.659063 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:38:18.661563 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:18.661489 2578 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-772rv" event={"ID":"f748f72b-69ca-4686-b516-08b6a1e0e7a1","Type":"ContainerStarted","Data":"641e8b7a4ba400bca1f754651dc8cf29821b77f1e8448144171589d1d63dc993"} Apr 22 18:38:18.676271 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:18.676240 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" Apr 22 18:38:18.688512 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:18.688445 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9" podStartSLOduration=10.648376379 podStartE2EDuration="27.688430498s" podCreationTimestamp="2026-04-22 18:37:51 +0000 UTC" firstStartedPulling="2026-04-22 18:37:54.265669819 +0000 UTC m=+3.255936439" lastFinishedPulling="2026-04-22 18:38:11.305723935 +0000 UTC m=+20.295990558" observedRunningTime="2026-04-22 18:38:18.686879167 +0000 UTC m=+27.677145821" watchObservedRunningTime="2026-04-22 18:38:18.688430498 +0000 UTC m=+27.678697126" Apr 22 18:38:18.873599 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:18.873558 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-fqfx5"] Apr 22 18:38:18.873760 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:18.873677 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fqfx5" Apr 22 18:38:18.873811 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:18.873757 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-fqfx5" podUID="33339752-7b6a-4821-a16d-b079144b080d" Apr 22 18:38:18.875988 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:18.875967 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bkhdr"] Apr 22 18:38:18.876163 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:18.876064 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkhdr" Apr 22 18:38:18.876163 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:18.876145 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bkhdr" podUID="9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f" Apr 22 18:38:19.665062 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:19.665027 2578 generic.go:358] "Generic (PLEG): container finished" podID="f748f72b-69ca-4686-b516-08b6a1e0e7a1" containerID="641e8b7a4ba400bca1f754651dc8cf29821b77f1e8448144171589d1d63dc993" exitCode=0 Apr 22 18:38:19.665492 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:19.665102 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-772rv" event={"ID":"f748f72b-69ca-4686-b516-08b6a1e0e7a1","Type":"ContainerDied","Data":"641e8b7a4ba400bca1f754651dc8cf29821b77f1e8448144171589d1d63dc993"} Apr 22 18:38:20.534388 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:20.534298 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fqfx5" Apr 22 18:38:20.534573 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:20.534415 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fqfx5" podUID="33339752-7b6a-4821-a16d-b079144b080d" Apr 22 18:38:20.534573 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:20.534449 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkhdr" Apr 22 18:38:20.534573 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:20.534528 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bkhdr" podUID="9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f" Apr 22 18:38:20.669117 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:20.669085 2578 generic.go:358] "Generic (PLEG): container finished" podID="f748f72b-69ca-4686-b516-08b6a1e0e7a1" containerID="7c2cd7d53d5d071c2bde0944deacd264cc4e04da8a881026a5f23953c59535bf" exitCode=0 Apr 22 18:38:20.669579 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:20.669154 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-772rv" event={"ID":"f748f72b-69ca-4686-b516-08b6a1e0e7a1","Type":"ContainerDied","Data":"7c2cd7d53d5d071c2bde0944deacd264cc4e04da8a881026a5f23953c59535bf"} Apr 22 18:38:22.534926 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:22.534888 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkhdr" Apr 22 18:38:22.534926 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:22.534909 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fqfx5" Apr 22 18:38:22.535600 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:22.535020 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fqfx5" podUID="33339752-7b6a-4821-a16d-b079144b080d" Apr 22 18:38:22.535600 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:22.535175 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bkhdr" podUID="9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f" Apr 22 18:38:24.333766 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.333684 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-151.ec2.internal" event="NodeReady" Apr 22 18:38:24.334243 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.333837 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 18:38:24.380814 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.380781 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-sv8gg"] Apr 22 18:38:24.384832 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.384801 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-xpp6f"] Apr 22 18:38:24.384993 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.384977 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-sv8gg" Apr 22 18:38:24.387408 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.387381 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 18:38:24.387560 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.387423 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-cfw2h\"" Apr 22 18:38:24.387560 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.387510 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 18:38:24.387942 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.387922 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xpp6f" Apr 22 18:38:24.390114 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.389940 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 18:38:24.390114 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.390111 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 18:38:24.390370 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.390350 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rz76n\"" Apr 22 18:38:24.390370 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.390380 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 18:38:24.393937 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.393650 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-sv8gg"] Apr 22 18:38:24.396027 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.396003 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xpp6f"] Apr 22 18:38:24.535117 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.535081 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkhdr" Apr 22 18:38:24.535117 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.535108 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fqfx5" Apr 22 18:38:24.538301 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.538050 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 18:38:24.538301 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.538050 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 18:38:24.538301 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.538141 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-djgkc\"" Apr 22 18:38:24.538301 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.538163 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-stzzq\"" Apr 22 18:38:24.538301 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.538182 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 18:38:24.573696 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.573656 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4-config-volume\") pod \"dns-default-sv8gg\" (UID: \"eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4\") " pod="openshift-dns/dns-default-sv8gg" Apr 22 18:38:24.573881 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.573719 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6c22\" (UniqueName: \"kubernetes.io/projected/eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4-kube-api-access-v6c22\") pod \"dns-default-sv8gg\" (UID: \"eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4\") " 
pod="openshift-dns/dns-default-sv8gg"
Apr 22 18:38:24.573881 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.573757 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c91e57b3-5752-444f-9b93-89af6a4673a4-cert\") pod \"ingress-canary-xpp6f\" (UID: \"c91e57b3-5752-444f-9b93-89af6a4673a4\") " pod="openshift-ingress-canary/ingress-canary-xpp6f"
Apr 22 18:38:24.573881 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.573780 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4-metrics-tls\") pod \"dns-default-sv8gg\" (UID: \"eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4\") " pod="openshift-dns/dns-default-sv8gg"
Apr 22 18:38:24.573881 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.573846 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqbz4\" (UniqueName: \"kubernetes.io/projected/c91e57b3-5752-444f-9b93-89af6a4673a4-kube-api-access-mqbz4\") pod \"ingress-canary-xpp6f\" (UID: \"c91e57b3-5752-444f-9b93-89af6a4673a4\") " pod="openshift-ingress-canary/ingress-canary-xpp6f"
Apr 22 18:38:24.574073 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.573869 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4-tmp-dir\") pod \"dns-default-sv8gg\" (UID: \"eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4\") " pod="openshift-dns/dns-default-sv8gg"
Apr 22 18:38:24.675012 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.674969 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4-config-volume\") pod \"dns-default-sv8gg\" (UID: \"eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4\") " pod="openshift-dns/dns-default-sv8gg"
Apr 22 18:38:24.675223 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.675038 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v6c22\" (UniqueName: \"kubernetes.io/projected/eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4-kube-api-access-v6c22\") pod \"dns-default-sv8gg\" (UID: \"eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4\") " pod="openshift-dns/dns-default-sv8gg"
Apr 22 18:38:24.675223 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.675064 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c91e57b3-5752-444f-9b93-89af6a4673a4-cert\") pod \"ingress-canary-xpp6f\" (UID: \"c91e57b3-5752-444f-9b93-89af6a4673a4\") " pod="openshift-ingress-canary/ingress-canary-xpp6f"
Apr 22 18:38:24.675223 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.675081 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4-metrics-tls\") pod \"dns-default-sv8gg\" (UID: \"eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4\") " pod="openshift-dns/dns-default-sv8gg"
Apr 22 18:38:24.675223 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.675148 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqbz4\" (UniqueName: \"kubernetes.io/projected/c91e57b3-5752-444f-9b93-89af6a4673a4-kube-api-access-mqbz4\") pod \"ingress-canary-xpp6f\" (UID: \"c91e57b3-5752-444f-9b93-89af6a4673a4\") " pod="openshift-ingress-canary/ingress-canary-xpp6f"
Apr 22 18:38:24.675223 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.675175 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4-tmp-dir\") pod \"dns-default-sv8gg\" (UID: \"eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4\") " pod="openshift-dns/dns-default-sv8gg"
Apr 22 18:38:24.675563 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:24.675222 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:38:24.675563 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:24.675294 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c91e57b3-5752-444f-9b93-89af6a4673a4-cert podName:c91e57b3-5752-444f-9b93-89af6a4673a4 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:25.175271874 +0000 UTC m=+34.165538491 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c91e57b3-5752-444f-9b93-89af6a4673a4-cert") pod "ingress-canary-xpp6f" (UID: "c91e57b3-5752-444f-9b93-89af6a4673a4") : secret "canary-serving-cert" not found
Apr 22 18:38:24.675563 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:24.675370 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:38:24.675563 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:24.675428 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4-metrics-tls podName:eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:25.175410256 +0000 UTC m=+34.165676867 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4-metrics-tls") pod "dns-default-sv8gg" (UID: "eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4") : secret "dns-default-metrics-tls" not found
Apr 22 18:38:24.675563 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.675478 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4-tmp-dir\") pod \"dns-default-sv8gg\" (UID: \"eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4\") " pod="openshift-dns/dns-default-sv8gg"
Apr 22 18:38:24.675791 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.675685 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4-config-volume\") pod \"dns-default-sv8gg\" (UID: \"eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4\") " pod="openshift-dns/dns-default-sv8gg"
Apr 22 18:38:24.686694 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.686664 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6c22\" (UniqueName: \"kubernetes.io/projected/eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4-kube-api-access-v6c22\") pod \"dns-default-sv8gg\" (UID: \"eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4\") " pod="openshift-dns/dns-default-sv8gg"
Apr 22 18:38:24.686837 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:24.686741 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqbz4\" (UniqueName: \"kubernetes.io/projected/c91e57b3-5752-444f-9b93-89af6a4673a4-kube-api-access-mqbz4\") pod \"ingress-canary-xpp6f\" (UID: \"c91e57b3-5752-444f-9b93-89af6a4673a4\") " pod="openshift-ingress-canary/ingress-canary-xpp6f"
Apr 22 18:38:25.178608 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:25.178571 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c91e57b3-5752-444f-9b93-89af6a4673a4-cert\") pod \"ingress-canary-xpp6f\" (UID: \"c91e57b3-5752-444f-9b93-89af6a4673a4\") " pod="openshift-ingress-canary/ingress-canary-xpp6f"
Apr 22 18:38:25.178608 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:25.178608 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4-metrics-tls\") pod \"dns-default-sv8gg\" (UID: \"eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4\") " pod="openshift-dns/dns-default-sv8gg"
Apr 22 18:38:25.178861 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:25.178660 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f-metrics-certs\") pod \"network-metrics-daemon-bkhdr\" (UID: \"9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f\") " pod="openshift-multus/network-metrics-daemon-bkhdr"
Apr 22 18:38:25.178861 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:25.178740 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:38:25.178861 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:25.178768 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 18:38:25.178861 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:25.178775 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:38:25.178861 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:25.178816 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c91e57b3-5752-444f-9b93-89af6a4673a4-cert podName:c91e57b3-5752-444f-9b93-89af6a4673a4 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:26.178796689 +0000 UTC m=+35.169063316 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c91e57b3-5752-444f-9b93-89af6a4673a4-cert") pod "ingress-canary-xpp6f" (UID: "c91e57b3-5752-444f-9b93-89af6a4673a4") : secret "canary-serving-cert" not found
Apr 22 18:38:25.178861 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:25.178833 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f-metrics-certs podName:9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f nodeName:}" failed. No retries permitted until 2026-04-22 18:38:57.178826856 +0000 UTC m=+66.169093463 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f-metrics-certs") pod "network-metrics-daemon-bkhdr" (UID: "9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f") : secret "metrics-daemon-secret" not found
Apr 22 18:38:25.178861 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:25.178848 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4-metrics-tls podName:eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:26.17883924 +0000 UTC m=+35.169105847 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4-metrics-tls") pod "dns-default-sv8gg" (UID: "eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4") : secret "dns-default-metrics-tls" not found
Apr 22 18:38:25.279993 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:25.279956 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zq7nz\" (UniqueName: \"kubernetes.io/projected/33339752-7b6a-4821-a16d-b079144b080d-kube-api-access-zq7nz\") pod \"network-check-target-fqfx5\" (UID: \"33339752-7b6a-4821-a16d-b079144b080d\") " pod="openshift-network-diagnostics/network-check-target-fqfx5"
Apr 22 18:38:25.282799 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:25.282767 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq7nz\" (UniqueName: \"kubernetes.io/projected/33339752-7b6a-4821-a16d-b079144b080d-kube-api-access-zq7nz\") pod \"network-check-target-fqfx5\" (UID: \"33339752-7b6a-4821-a16d-b079144b080d\") " pod="openshift-network-diagnostics/network-check-target-fqfx5"
Apr 22 18:38:25.452015 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:25.451925 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fqfx5"
Apr 22 18:38:26.186444 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:26.186405 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c91e57b3-5752-444f-9b93-89af6a4673a4-cert\") pod \"ingress-canary-xpp6f\" (UID: \"c91e57b3-5752-444f-9b93-89af6a4673a4\") " pod="openshift-ingress-canary/ingress-canary-xpp6f"
Apr 22 18:38:26.186444 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:26.186447 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4-metrics-tls\") pod \"dns-default-sv8gg\" (UID: \"eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4\") " pod="openshift-dns/dns-default-sv8gg"
Apr 22 18:38:26.186704 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:26.186581 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:38:26.186704 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:26.186591 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:38:26.186704 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:26.186637 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4-metrics-tls podName:eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:28.186622104 +0000 UTC m=+37.176888731 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4-metrics-tls") pod "dns-default-sv8gg" (UID: "eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4") : secret "dns-default-metrics-tls" not found
Apr 22 18:38:26.186704 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:26.186651 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c91e57b3-5752-444f-9b93-89af6a4673a4-cert podName:c91e57b3-5752-444f-9b93-89af6a4673a4 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:28.186645326 +0000 UTC m=+37.176911933 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c91e57b3-5752-444f-9b93-89af6a4673a4-cert") pod "ingress-canary-xpp6f" (UID: "c91e57b3-5752-444f-9b93-89af6a4673a4") : secret "canary-serving-cert" not found
Apr 22 18:38:26.261034 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:26.260997 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-fqfx5"]
Apr 22 18:38:26.325092 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:38:26.325054 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33339752_7b6a_4821_a16d_b079144b080d.slice/crio-85e7398dd9be65900796e0a4cdf4fe649357d4663dd2ab97ca026987038f7021 WatchSource:0}: Error finding container 85e7398dd9be65900796e0a4cdf4fe649357d4663dd2ab97ca026987038f7021: Status 404 returned error can't find the container with id 85e7398dd9be65900796e0a4cdf4fe649357d4663dd2ab97ca026987038f7021
Apr 22 18:38:26.684423 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:26.684384 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fqfx5" event={"ID":"33339752-7b6a-4821-a16d-b079144b080d","Type":"ContainerStarted","Data":"85e7398dd9be65900796e0a4cdf4fe649357d4663dd2ab97ca026987038f7021"}
Apr 22 18:38:26.686819 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:26.686789 2578 generic.go:358] "Generic (PLEG): container finished" podID="f748f72b-69ca-4686-b516-08b6a1e0e7a1" containerID="f1aa0a2b88e12314d3e9b0905c1715a6f62b89bad1b343d6143686bac049c36d" exitCode=0
Apr 22 18:38:26.686921 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:26.686820 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-772rv" event={"ID":"f748f72b-69ca-4686-b516-08b6a1e0e7a1","Type":"ContainerDied","Data":"f1aa0a2b88e12314d3e9b0905c1715a6f62b89bad1b343d6143686bac049c36d"}
Apr 22 18:38:27.692318 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:27.692135 2578 generic.go:358] "Generic (PLEG): container finished" podID="f748f72b-69ca-4686-b516-08b6a1e0e7a1" containerID="b742aafbb3536e46715dcc47a99884e9eea0b21201524b106b274f3547897d75" exitCode=0
Apr 22 18:38:27.692318 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:27.692217 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-772rv" event={"ID":"f748f72b-69ca-4686-b516-08b6a1e0e7a1","Type":"ContainerDied","Data":"b742aafbb3536e46715dcc47a99884e9eea0b21201524b106b274f3547897d75"}
Apr 22 18:38:28.202057 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:28.202020 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c91e57b3-5752-444f-9b93-89af6a4673a4-cert\") pod \"ingress-canary-xpp6f\" (UID: \"c91e57b3-5752-444f-9b93-89af6a4673a4\") " pod="openshift-ingress-canary/ingress-canary-xpp6f"
Apr 22 18:38:28.202057 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:28.202065 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4-metrics-tls\") pod \"dns-default-sv8gg\" (UID: \"eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4\") " pod="openshift-dns/dns-default-sv8gg"
Apr 22 18:38:28.202285 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:28.202189 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:38:28.202285 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:28.202191 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:38:28.202389 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:28.202251 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4-metrics-tls podName:eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:32.202232545 +0000 UTC m=+41.192499169 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4-metrics-tls") pod "dns-default-sv8gg" (UID: "eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4") : secret "dns-default-metrics-tls" not found
Apr 22 18:38:28.202389 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:28.202330 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c91e57b3-5752-444f-9b93-89af6a4673a4-cert podName:c91e57b3-5752-444f-9b93-89af6a4673a4 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:32.202309291 +0000 UTC m=+41.192575912 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c91e57b3-5752-444f-9b93-89af6a4673a4-cert") pod "ingress-canary-xpp6f" (UID: "c91e57b3-5752-444f-9b93-89af6a4673a4") : secret "canary-serving-cert" not found
Apr 22 18:38:28.697823 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:28.697793 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-772rv" event={"ID":"f748f72b-69ca-4686-b516-08b6a1e0e7a1","Type":"ContainerStarted","Data":"1a3112d69ea88c84924fde9d0caed91d376f482131c3b1e6fb843ce43e8370f5"}
Apr 22 18:38:28.725542 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:28.725488 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-772rv" podStartSLOduration=5.640061204 podStartE2EDuration="37.725452852s" podCreationTimestamp="2026-04-22 18:37:51 +0000 UTC" firstStartedPulling="2026-04-22 18:37:54.268276992 +0000 UTC m=+3.258543600" lastFinishedPulling="2026-04-22 18:38:26.353668641 +0000 UTC m=+35.343935248" observedRunningTime="2026-04-22 18:38:28.719994849 +0000 UTC m=+37.710261478" watchObservedRunningTime="2026-04-22 18:38:28.725452852 +0000 UTC m=+37.715719480"
Apr 22 18:38:29.701425 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:29.701386 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fqfx5" event={"ID":"33339752-7b6a-4821-a16d-b079144b080d","Type":"ContainerStarted","Data":"ac7f4905f3d12699cc7f693b2282e926df17be187c3546d6e83f68e7d96705f6"}
Apr 22 18:38:29.701843 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:29.701767 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-fqfx5"
Apr 22 18:38:29.717719 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:29.717628 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-fqfx5" podStartSLOduration=35.581301151 podStartE2EDuration="38.717615539s" podCreationTimestamp="2026-04-22 18:37:51 +0000 UTC" firstStartedPulling="2026-04-22 18:38:26.331986841 +0000 UTC m=+35.322253452" lastFinishedPulling="2026-04-22 18:38:29.468301224 +0000 UTC m=+38.458567840" observedRunningTime="2026-04-22 18:38:29.716967131 +0000 UTC m=+38.707233784" watchObservedRunningTime="2026-04-22 18:38:29.717615539 +0000 UTC m=+38.707882168"
Apr 22 18:38:32.230698 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:32.230661 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c91e57b3-5752-444f-9b93-89af6a4673a4-cert\") pod \"ingress-canary-xpp6f\" (UID: \"c91e57b3-5752-444f-9b93-89af6a4673a4\") " pod="openshift-ingress-canary/ingress-canary-xpp6f"
Apr 22 18:38:32.230698 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:32.230695 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4-metrics-tls\") pod \"dns-default-sv8gg\" (UID: \"eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4\") " pod="openshift-dns/dns-default-sv8gg"
Apr 22 18:38:32.231163 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:32.230791 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:38:32.231163 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:32.230809 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:38:32.231163 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:32.230847 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4-metrics-tls podName:eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:40.23083034 +0000 UTC m=+49.221096947 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4-metrics-tls") pod "dns-default-sv8gg" (UID: "eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4") : secret "dns-default-metrics-tls" not found
Apr 22 18:38:32.231163 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:32.230872 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c91e57b3-5752-444f-9b93-89af6a4673a4-cert podName:c91e57b3-5752-444f-9b93-89af6a4673a4 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:40.230855119 +0000 UTC m=+49.221121726 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c91e57b3-5752-444f-9b93-89af6a4673a4-cert") pod "ingress-canary-xpp6f" (UID: "c91e57b3-5752-444f-9b93-89af6a4673a4") : secret "canary-serving-cert" not found
Apr 22 18:38:40.284955 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:40.284918 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c91e57b3-5752-444f-9b93-89af6a4673a4-cert\") pod \"ingress-canary-xpp6f\" (UID: \"c91e57b3-5752-444f-9b93-89af6a4673a4\") " pod="openshift-ingress-canary/ingress-canary-xpp6f"
Apr 22 18:38:40.284955 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:40.284958 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4-metrics-tls\") pod \"dns-default-sv8gg\" (UID: \"eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4\") " pod="openshift-dns/dns-default-sv8gg"
Apr 22 18:38:40.285497 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:40.285078 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:38:40.285497 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:40.285078 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:38:40.285497 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:40.285132 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4-metrics-tls podName:eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:56.285117119 +0000 UTC m=+65.275383725 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4-metrics-tls") pod "dns-default-sv8gg" (UID: "eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4") : secret "dns-default-metrics-tls" not found
Apr 22 18:38:40.285497 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:40.285145 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c91e57b3-5752-444f-9b93-89af6a4673a4-cert podName:c91e57b3-5752-444f-9b93-89af6a4673a4 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:56.28513905 +0000 UTC m=+65.275405658 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c91e57b3-5752-444f-9b93-89af6a4673a4-cert") pod "ingress-canary-xpp6f" (UID: "c91e57b3-5752-444f-9b93-89af6a4673a4") : secret "canary-serving-cert" not found
Apr 22 18:38:48.224154 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.224118 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-pc6pk"]
Apr 22 18:38:48.266455 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.266415 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-pc6pk"]
Apr 22 18:38:48.266635 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.266554 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pc6pk"
Apr 22 18:38:48.269976 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.269152 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 22 18:38:48.271644 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.271616 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 18:38:48.271806 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.271692 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 18:38:48.271806 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.271746 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 22 18:38:48.271806 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.271762 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-zclrv\""
Apr 22 18:38:48.331975 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.331935 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-qtbjf"]
Apr 22 18:38:48.340809 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.340772 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2812ef50-c4e4-42a2-940f-01dd5bf968d4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-pc6pk\" (UID: \"2812ef50-c4e4-42a2-940f-01dd5bf968d4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pc6pk"
Apr 22 18:38:48.340974 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.340818 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9mbk\" (UniqueName: \"kubernetes.io/projected/2812ef50-c4e4-42a2-940f-01dd5bf968d4-kube-api-access-x9mbk\") pod \"cluster-monitoring-operator-75587bd455-pc6pk\" (UID: \"2812ef50-c4e4-42a2-940f-01dd5bf968d4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pc6pk"
Apr 22 18:38:48.340974 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.340875 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/2812ef50-c4e4-42a2-940f-01dd5bf968d4-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-pc6pk\" (UID: \"2812ef50-c4e4-42a2-940f-01dd5bf968d4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pc6pk"
Apr 22 18:38:48.356070 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.356040 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-p6c6r"]
Apr 22 18:38:48.356243 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.356185 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-qtbjf"
Apr 22 18:38:48.358712 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.358688 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:38:48.358852 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.358689 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-wp88d\""
Apr 22 18:38:48.358852 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.358688 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 22 18:38:48.373760 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.373729 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-9d4498d6c-bkgtv"]
Apr 22 18:38:48.373906 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.373875 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-p6c6r"
Apr 22 18:38:48.376369 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.376345 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-d5wh4\""
Apr 22 18:38:48.376549 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.376531 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 22 18:38:48.376620 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.376538 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 22 18:38:48.376659 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.376554 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:38:48.394307 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.394273 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-qtbjf"]
Apr 22 18:38:48.394307 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.394303 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-p6c6r"]
Apr 22 18:38:48.394307 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.394315 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-9d4498d6c-bkgtv"]
Apr 22 18:38:48.394544 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.394409 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv"
Apr 22 18:38:48.396763 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.396735 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 22 18:38:48.396904 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.396764 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 22 18:38:48.396904 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.396738 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 22 18:38:48.396904 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.396837 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-fbmmk\""
Apr 22 18:38:48.402030 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.402007 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 22 18:38:48.441883 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.441854 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ef9d655a-2703-42fd-97e9-84bb1fe30e68-bound-sa-token\") pod \"image-registry-9d4498d6c-bkgtv\" (UID: \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv"
Apr 22 18:38:48.442050 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.441892 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef9d655a-2703-42fd-97e9-84bb1fe30e68-trusted-ca\") pod \"image-registry-9d4498d6c-bkgtv\" (UID: \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv"
Apr 22 18:38:48.442050 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.441945 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2812ef50-c4e4-42a2-940f-01dd5bf968d4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-pc6pk\" (UID: \"2812ef50-c4e4-42a2-940f-01dd5bf968d4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pc6pk"
Apr 22 18:38:48.442050 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.441970 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcxrt\" (UniqueName: \"kubernetes.io/projected/6ef4308b-cece-496e-a0f7-3f561908b409-kube-api-access-zcxrt\") pod \"cluster-samples-operator-6dc5bdb6b4-p6c6r\" (UID: \"6ef4308b-cece-496e-a0f7-3f561908b409\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-p6c6r"
Apr 22 18:38:48.442050 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.441989 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9mbk\" (UniqueName: \"kubernetes.io/projected/2812ef50-c4e4-42a2-940f-01dd5bf968d4-kube-api-access-x9mbk\") pod \"cluster-monitoring-operator-75587bd455-pc6pk\" (UID: \"2812ef50-c4e4-42a2-940f-01dd5bf968d4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pc6pk"
Apr 22 18:38:48.442050 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.442021 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ef9d655a-2703-42fd-97e9-84bb1fe30e68-installation-pull-secrets\") pod \"image-registry-9d4498d6c-bkgtv\" (UID: \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv"
Apr 22 18:38:48.442283 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.442048 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ef9d655a-2703-42fd-97e9-84bb1fe30e68-image-registry-private-configuration\") pod \"image-registry-9d4498d6c-bkgtv\" (UID: \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv"
Apr 22 18:38:48.442283 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.442091 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ef9d655a-2703-42fd-97e9-84bb1fe30e68-registry-tls\") pod \"image-registry-9d4498d6c-bkgtv\" (UID: \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv"
Apr 22 18:38:48.442283 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:48.442092 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 18:38:48.442283 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.442117 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/2812ef50-c4e4-42a2-940f-01dd5bf968d4-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-pc6pk\" (UID: \"2812ef50-c4e4-42a2-940f-01dd5bf968d4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pc6pk"
Apr 22 18:38:48.442283 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.442134 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ef9d655a-2703-42fd-97e9-84bb1fe30e68-ca-trust-extracted\") pod \"image-registry-9d4498d6c-bkgtv\" (UID: \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv"
Apr 22 18:38:48.442283 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:48.442208 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2812ef50-c4e4-42a2-940f-01dd5bf968d4-cluster-monitoring-operator-tls podName:2812ef50-c4e4-42a2-940f-01dd5bf968d4 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:48.942182306 +0000 UTC m=+57.932448917 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/2812ef50-c4e4-42a2-940f-01dd5bf968d4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-pc6pk" (UID: "2812ef50-c4e4-42a2-940f-01dd5bf968d4") : secret "cluster-monitoring-operator-tls" not found
Apr 22 18:38:48.442283 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.442258 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnxnt\" (UniqueName: \"kubernetes.io/projected/76c4e788-057e-424f-9bb3-f21537fa2489-kube-api-access-qnxnt\") pod \"volume-data-source-validator-7c6cbb6c87-qtbjf\" (UID: \"76c4e788-057e-424f-9bb3-f21537fa2489\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-qtbjf"
Apr 22 18:38:48.442590 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.442335 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ef4308b-cece-496e-a0f7-3f561908b409-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-p6c6r\" (UID: \"6ef4308b-cece-496e-a0f7-3f561908b409\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-p6c6r"
Apr 22 18:38:48.442590 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.442368 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ef9d655a-2703-42fd-97e9-84bb1fe30e68-registry-certificates\") pod \"image-registry-9d4498d6c-bkgtv\" (UID: \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv" Apr 22 18:38:48.442590 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.442394 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6fb6\" (UniqueName: \"kubernetes.io/projected/ef9d655a-2703-42fd-97e9-84bb1fe30e68-kube-api-access-q6fb6\") pod \"image-registry-9d4498d6c-bkgtv\" (UID: \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv" Apr 22 18:38:48.442814 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.442797 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/2812ef50-c4e4-42a2-940f-01dd5bf968d4-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-pc6pk\" (UID: \"2812ef50-c4e4-42a2-940f-01dd5bf968d4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pc6pk" Apr 22 18:38:48.475752 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.475667 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9mbk\" (UniqueName: \"kubernetes.io/projected/2812ef50-c4e4-42a2-940f-01dd5bf968d4-kube-api-access-x9mbk\") pod \"cluster-monitoring-operator-75587bd455-pc6pk\" (UID: \"2812ef50-c4e4-42a2-940f-01dd5bf968d4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pc6pk" Apr 22 18:38:48.542810 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.542780 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ef9d655a-2703-42fd-97e9-84bb1fe30e68-ca-trust-extracted\") pod \"image-registry-9d4498d6c-bkgtv\" (UID: 
\"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv" Apr 22 18:38:48.542947 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.542818 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qnxnt\" (UniqueName: \"kubernetes.io/projected/76c4e788-057e-424f-9bb3-f21537fa2489-kube-api-access-qnxnt\") pod \"volume-data-source-validator-7c6cbb6c87-qtbjf\" (UID: \"76c4e788-057e-424f-9bb3-f21537fa2489\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-qtbjf" Apr 22 18:38:48.542947 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.542845 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ef4308b-cece-496e-a0f7-3f561908b409-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-p6c6r\" (UID: \"6ef4308b-cece-496e-a0f7-3f561908b409\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-p6c6r" Apr 22 18:38:48.542947 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.542863 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ef9d655a-2703-42fd-97e9-84bb1fe30e68-registry-certificates\") pod \"image-registry-9d4498d6c-bkgtv\" (UID: \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv" Apr 22 18:38:48.542947 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.542878 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q6fb6\" (UniqueName: \"kubernetes.io/projected/ef9d655a-2703-42fd-97e9-84bb1fe30e68-kube-api-access-q6fb6\") pod \"image-registry-9d4498d6c-bkgtv\" (UID: \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv" Apr 22 18:38:48.542947 ip-10-0-132-151 kubenswrapper[2578]: 
I0422 18:38:48.542923 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ef9d655a-2703-42fd-97e9-84bb1fe30e68-bound-sa-token\") pod \"image-registry-9d4498d6c-bkgtv\" (UID: \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv" Apr 22 18:38:48.543193 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.542953 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef9d655a-2703-42fd-97e9-84bb1fe30e68-trusted-ca\") pod \"image-registry-9d4498d6c-bkgtv\" (UID: \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv" Apr 22 18:38:48.543193 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:48.542973 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:38:48.543193 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.542990 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zcxrt\" (UniqueName: \"kubernetes.io/projected/6ef4308b-cece-496e-a0f7-3f561908b409-kube-api-access-zcxrt\") pod \"cluster-samples-operator-6dc5bdb6b4-p6c6r\" (UID: \"6ef4308b-cece-496e-a0f7-3f561908b409\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-p6c6r" Apr 22 18:38:48.543193 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.543015 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ef9d655a-2703-42fd-97e9-84bb1fe30e68-installation-pull-secrets\") pod \"image-registry-9d4498d6c-bkgtv\" (UID: \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv" Apr 22 18:38:48.543193 ip-10-0-132-151 kubenswrapper[2578]: E0422 
18:38:48.543044 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ef4308b-cece-496e-a0f7-3f561908b409-samples-operator-tls podName:6ef4308b-cece-496e-a0f7-3f561908b409 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:49.043023197 +0000 UTC m=+58.033289807 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6ef4308b-cece-496e-a0f7-3f561908b409-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-p6c6r" (UID: "6ef4308b-cece-496e-a0f7-3f561908b409") : secret "samples-operator-tls" not found Apr 22 18:38:48.543193 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.543093 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ef9d655a-2703-42fd-97e9-84bb1fe30e68-image-registry-private-configuration\") pod \"image-registry-9d4498d6c-bkgtv\" (UID: \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv" Apr 22 18:38:48.543193 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.543155 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ef9d655a-2703-42fd-97e9-84bb1fe30e68-registry-tls\") pod \"image-registry-9d4498d6c-bkgtv\" (UID: \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv" Apr 22 18:38:48.543567 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:48.543263 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:38:48.543567 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:48.543278 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9d4498d6c-bkgtv: secret "image-registry-tls" not found Apr 
22 18:38:48.543567 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.543278 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ef9d655a-2703-42fd-97e9-84bb1fe30e68-ca-trust-extracted\") pod \"image-registry-9d4498d6c-bkgtv\" (UID: \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv" Apr 22 18:38:48.543567 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:48.543320 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ef9d655a-2703-42fd-97e9-84bb1fe30e68-registry-tls podName:ef9d655a-2703-42fd-97e9-84bb1fe30e68 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:49.043303764 +0000 UTC m=+58.033570376 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ef9d655a-2703-42fd-97e9-84bb1fe30e68-registry-tls") pod "image-registry-9d4498d6c-bkgtv" (UID: "ef9d655a-2703-42fd-97e9-84bb1fe30e68") : secret "image-registry-tls" not found Apr 22 18:38:48.543771 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.543741 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ef9d655a-2703-42fd-97e9-84bb1fe30e68-registry-certificates\") pod \"image-registry-9d4498d6c-bkgtv\" (UID: \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv" Apr 22 18:38:48.544346 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.544327 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef9d655a-2703-42fd-97e9-84bb1fe30e68-trusted-ca\") pod \"image-registry-9d4498d6c-bkgtv\" (UID: \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv" Apr 22 18:38:48.545611 ip-10-0-132-151 kubenswrapper[2578]: 
I0422 18:38:48.545590 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ef9d655a-2703-42fd-97e9-84bb1fe30e68-image-registry-private-configuration\") pod \"image-registry-9d4498d6c-bkgtv\" (UID: \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv" Apr 22 18:38:48.545695 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.545678 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ef9d655a-2703-42fd-97e9-84bb1fe30e68-installation-pull-secrets\") pod \"image-registry-9d4498d6c-bkgtv\" (UID: \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv" Apr 22 18:38:48.552323 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.552294 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnxnt\" (UniqueName: \"kubernetes.io/projected/76c4e788-057e-424f-9bb3-f21537fa2489-kube-api-access-qnxnt\") pod \"volume-data-source-validator-7c6cbb6c87-qtbjf\" (UID: \"76c4e788-057e-424f-9bb3-f21537fa2489\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-qtbjf" Apr 22 18:38:48.552587 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.552569 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6fb6\" (UniqueName: \"kubernetes.io/projected/ef9d655a-2703-42fd-97e9-84bb1fe30e68-kube-api-access-q6fb6\") pod \"image-registry-9d4498d6c-bkgtv\" (UID: \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv" Apr 22 18:38:48.553002 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.552980 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/ef9d655a-2703-42fd-97e9-84bb1fe30e68-bound-sa-token\") pod \"image-registry-9d4498d6c-bkgtv\" (UID: \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv" Apr 22 18:38:48.553396 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.553379 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcxrt\" (UniqueName: \"kubernetes.io/projected/6ef4308b-cece-496e-a0f7-3f561908b409-kube-api-access-zcxrt\") pod \"cluster-samples-operator-6dc5bdb6b4-p6c6r\" (UID: \"6ef4308b-cece-496e-a0f7-3f561908b409\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-p6c6r" Apr 22 18:38:48.665864 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.665819 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-qtbjf" Apr 22 18:38:48.782846 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.782808 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-qtbjf"] Apr 22 18:38:48.786394 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:38:48.786363 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76c4e788_057e_424f_9bb3_f21537fa2489.slice/crio-262e7947a46ccba305c6fff779c292214d6ee198db2c755cd7e7a79be67a45d8 WatchSource:0}: Error finding container 262e7947a46ccba305c6fff779c292214d6ee198db2c755cd7e7a79be67a45d8: Status 404 returned error can't find the container with id 262e7947a46ccba305c6fff779c292214d6ee198db2c755cd7e7a79be67a45d8 Apr 22 18:38:48.946441 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:48.946402 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/2812ef50-c4e4-42a2-940f-01dd5bf968d4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-pc6pk\" (UID: \"2812ef50-c4e4-42a2-940f-01dd5bf968d4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pc6pk" Apr 22 18:38:48.946604 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:48.946566 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:38:48.946660 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:48.946636 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2812ef50-c4e4-42a2-940f-01dd5bf968d4-cluster-monitoring-operator-tls podName:2812ef50-c4e4-42a2-940f-01dd5bf968d4 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:49.946617624 +0000 UTC m=+58.936884230 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/2812ef50-c4e4-42a2-940f-01dd5bf968d4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-pc6pk" (UID: "2812ef50-c4e4-42a2-940f-01dd5bf968d4") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:38:49.047006 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:49.046924 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ef9d655a-2703-42fd-97e9-84bb1fe30e68-registry-tls\") pod \"image-registry-9d4498d6c-bkgtv\" (UID: \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv" Apr 22 18:38:49.047006 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:49.046968 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ef4308b-cece-496e-a0f7-3f561908b409-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-p6c6r\" 
(UID: \"6ef4308b-cece-496e-a0f7-3f561908b409\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-p6c6r" Apr 22 18:38:49.047187 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:49.047074 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:38:49.047187 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:49.047091 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:38:49.047187 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:49.047097 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9d4498d6c-bkgtv: secret "image-registry-tls" not found Apr 22 18:38:49.047187 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:49.047153 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ef4308b-cece-496e-a0f7-3f561908b409-samples-operator-tls podName:6ef4308b-cece-496e-a0f7-3f561908b409 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:50.04713355 +0000 UTC m=+59.037400174 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6ef4308b-cece-496e-a0f7-3f561908b409-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-p6c6r" (UID: "6ef4308b-cece-496e-a0f7-3f561908b409") : secret "samples-operator-tls" not found Apr 22 18:38:49.047187 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:49.047170 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ef9d655a-2703-42fd-97e9-84bb1fe30e68-registry-tls podName:ef9d655a-2703-42fd-97e9-84bb1fe30e68 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:50.047161996 +0000 UTC m=+59.037428603 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ef9d655a-2703-42fd-97e9-84bb1fe30e68-registry-tls") pod "image-registry-9d4498d6c-bkgtv" (UID: "ef9d655a-2703-42fd-97e9-84bb1fe30e68") : secret "image-registry-tls" not found Apr 22 18:38:49.739345 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:49.739301 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-qtbjf" event={"ID":"76c4e788-057e-424f-9bb3-f21537fa2489","Type":"ContainerStarted","Data":"262e7947a46ccba305c6fff779c292214d6ee198db2c755cd7e7a79be67a45d8"} Apr 22 18:38:49.953005 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:49.952965 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2812ef50-c4e4-42a2-940f-01dd5bf968d4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-pc6pk\" (UID: \"2812ef50-c4e4-42a2-940f-01dd5bf968d4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pc6pk" Apr 22 18:38:49.953195 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:49.953130 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:38:49.953257 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:49.953199 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2812ef50-c4e4-42a2-940f-01dd5bf968d4-cluster-monitoring-operator-tls podName:2812ef50-c4e4-42a2-940f-01dd5bf968d4 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:51.953181892 +0000 UTC m=+60.943448500 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/2812ef50-c4e4-42a2-940f-01dd5bf968d4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-pc6pk" (UID: "2812ef50-c4e4-42a2-940f-01dd5bf968d4") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:38:50.053312 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:50.053282 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ef9d655a-2703-42fd-97e9-84bb1fe30e68-registry-tls\") pod \"image-registry-9d4498d6c-bkgtv\" (UID: \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv" Apr 22 18:38:50.053451 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:50.053341 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ef4308b-cece-496e-a0f7-3f561908b409-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-p6c6r\" (UID: \"6ef4308b-cece-496e-a0f7-3f561908b409\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-p6c6r" Apr 22 18:38:50.053451 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:50.053420 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:38:50.053451 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:50.053438 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9d4498d6c-bkgtv: secret "image-registry-tls" not found Apr 22 18:38:50.053610 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:50.053513 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ef9d655a-2703-42fd-97e9-84bb1fe30e68-registry-tls podName:ef9d655a-2703-42fd-97e9-84bb1fe30e68 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:38:52.053495852 +0000 UTC m=+61.043762459 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ef9d655a-2703-42fd-97e9-84bb1fe30e68-registry-tls") pod "image-registry-9d4498d6c-bkgtv" (UID: "ef9d655a-2703-42fd-97e9-84bb1fe30e68") : secret "image-registry-tls" not found Apr 22 18:38:50.053610 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:50.053519 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:38:50.053610 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:50.053589 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ef4308b-cece-496e-a0f7-3f561908b409-samples-operator-tls podName:6ef4308b-cece-496e-a0f7-3f561908b409 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:52.053570692 +0000 UTC m=+61.043837310 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6ef4308b-cece-496e-a0f7-3f561908b409-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-p6c6r" (UID: "6ef4308b-cece-496e-a0f7-3f561908b409") : secret "samples-operator-tls" not found Apr 22 18:38:50.533876 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:50.533840 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-bw86q"] Apr 22 18:38:50.536601 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:50.536584 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-bw86q" Apr 22 18:38:50.539048 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:50.539028 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-5m5sb\"" Apr 22 18:38:50.544410 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:50.544383 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-bw86q"] Apr 22 18:38:50.557329 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:50.557301 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7hdl\" (UniqueName: \"kubernetes.io/projected/6855095a-2036-4c98-8d4f-2bea5b4d8cd6-kube-api-access-s7hdl\") pod \"network-check-source-8894fc9bd-bw86q\" (UID: \"6855095a-2036-4c98-8d4f-2bea5b4d8cd6\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-bw86q" Apr 22 18:38:50.658396 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:50.658353 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7hdl\" (UniqueName: \"kubernetes.io/projected/6855095a-2036-4c98-8d4f-2bea5b4d8cd6-kube-api-access-s7hdl\") pod \"network-check-source-8894fc9bd-bw86q\" (UID: \"6855095a-2036-4c98-8d4f-2bea5b4d8cd6\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-bw86q" Apr 22 18:38:50.667730 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:50.667697 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7hdl\" (UniqueName: \"kubernetes.io/projected/6855095a-2036-4c98-8d4f-2bea5b4d8cd6-kube-api-access-s7hdl\") pod \"network-check-source-8894fc9bd-bw86q\" (UID: \"6855095a-2036-4c98-8d4f-2bea5b4d8cd6\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-bw86q" Apr 22 18:38:50.680405 ip-10-0-132-151 kubenswrapper[2578]: I0422 
18:38:50.680371 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f2rx9"
Apr 22 18:38:50.742909 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:50.742872 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-qtbjf" event={"ID":"76c4e788-057e-424f-9bb3-f21537fa2489","Type":"ContainerStarted","Data":"618d1f1ab0ba81d2496b07aaf7ba1eff558125e5cf7f5d5ed7f4c1d999a60591"}
Apr 22 18:38:50.760871 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:50.760756 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-qtbjf" podStartSLOduration=1.536807208 podStartE2EDuration="2.76073834s" podCreationTimestamp="2026-04-22 18:38:48 +0000 UTC" firstStartedPulling="2026-04-22 18:38:48.788148513 +0000 UTC m=+57.778415120" lastFinishedPulling="2026-04-22 18:38:50.012079646 +0000 UTC m=+59.002346252" observedRunningTime="2026-04-22 18:38:50.760702985 +0000 UTC m=+59.750969627" watchObservedRunningTime="2026-04-22 18:38:50.76073834 +0000 UTC m=+59.751004970"
Apr 22 18:38:50.845767 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:50.845687 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-bw86q"
Apr 22 18:38:50.979137 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:50.979107 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-bw86q"]
Apr 22 18:38:50.982264 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:38:50.982242 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6855095a_2036_4c98_8d4f_2bea5b4d8cd6.slice/crio-667e2ebda53623dead040352eab934cf79b9a024ea4f6f84a6239019b80ea863 WatchSource:0}: Error finding container 667e2ebda53623dead040352eab934cf79b9a024ea4f6f84a6239019b80ea863: Status 404 returned error can't find the container with id 667e2ebda53623dead040352eab934cf79b9a024ea4f6f84a6239019b80ea863
Apr 22 18:38:51.746802 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:51.746762 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-bw86q" event={"ID":"6855095a-2036-4c98-8d4f-2bea5b4d8cd6","Type":"ContainerStarted","Data":"b81d48d683952456bd0373579b63b94a60dc2b988574825a295d044d0ee2c067"}
Apr 22 18:38:51.746802 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:51.746805 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-bw86q" event={"ID":"6855095a-2036-4c98-8d4f-2bea5b4d8cd6","Type":"ContainerStarted","Data":"667e2ebda53623dead040352eab934cf79b9a024ea4f6f84a6239019b80ea863"}
Apr 22 18:38:51.766637 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:51.766586 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-bw86q" podStartSLOduration=1.766570814 podStartE2EDuration="1.766570814s" podCreationTimestamp="2026-04-22 18:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:38:51.765815528 +0000 UTC m=+60.756082157" watchObservedRunningTime="2026-04-22 18:38:51.766570814 +0000 UTC m=+60.756837443"
Apr 22 18:38:51.983734 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:51.983688 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2812ef50-c4e4-42a2-940f-01dd5bf968d4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-pc6pk\" (UID: \"2812ef50-c4e4-42a2-940f-01dd5bf968d4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pc6pk"
Apr 22 18:38:51.983929 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:51.983906 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 18:38:51.983997 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:51.983985 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2812ef50-c4e4-42a2-940f-01dd5bf968d4-cluster-monitoring-operator-tls podName:2812ef50-c4e4-42a2-940f-01dd5bf968d4 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:55.983963202 +0000 UTC m=+64.974229812 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/2812ef50-c4e4-42a2-940f-01dd5bf968d4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-pc6pk" (UID: "2812ef50-c4e4-42a2-940f-01dd5bf968d4") : secret "cluster-monitoring-operator-tls" not found
Apr 22 18:38:52.084372 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:52.084278 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ef9d655a-2703-42fd-97e9-84bb1fe30e68-registry-tls\") pod \"image-registry-9d4498d6c-bkgtv\" (UID: \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv"
Apr 22 18:38:52.084372 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:52.084332 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ef4308b-cece-496e-a0f7-3f561908b409-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-p6c6r\" (UID: \"6ef4308b-cece-496e-a0f7-3f561908b409\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-p6c6r"
Apr 22 18:38:52.084591 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:52.084442 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 18:38:52.084591 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:52.084535 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ef4308b-cece-496e-a0f7-3f561908b409-samples-operator-tls podName:6ef4308b-cece-496e-a0f7-3f561908b409 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:56.084522697 +0000 UTC m=+65.074789304 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6ef4308b-cece-496e-a0f7-3f561908b409-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-p6c6r" (UID: "6ef4308b-cece-496e-a0f7-3f561908b409") : secret "samples-operator-tls" not found
Apr 22 18:38:52.084591 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:52.084439 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:38:52.084591 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:52.084559 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9d4498d6c-bkgtv: secret "image-registry-tls" not found
Apr 22 18:38:52.084721 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:52.084617 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ef9d655a-2703-42fd-97e9-84bb1fe30e68-registry-tls podName:ef9d655a-2703-42fd-97e9-84bb1fe30e68 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:56.084603036 +0000 UTC m=+65.074869643 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ef9d655a-2703-42fd-97e9-84bb1fe30e68-registry-tls") pod "image-registry-9d4498d6c-bkgtv" (UID: "ef9d655a-2703-42fd-97e9-84bb1fe30e68") : secret "image-registry-tls" not found
Apr 22 18:38:53.891978 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:53.891942 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-qrql8"]
Apr 22 18:38:53.895951 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:53.895929 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qrql8"
Apr 22 18:38:53.898658 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:53.898630 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-mqgck\""
Apr 22 18:38:53.899581 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:53.899561 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 22 18:38:53.899705 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:53.899615 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 22 18:38:53.902284 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:53.902035 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-qrql8"]
Apr 22 18:38:54.000815 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:54.000770 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knwbs\" (UniqueName: \"kubernetes.io/projected/e9303b48-0205-46d7-b7c1-bef8c52ae1c2-kube-api-access-knwbs\") pod \"migrator-74bb7799d9-qrql8\" (UID: \"e9303b48-0205-46d7-b7c1-bef8c52ae1c2\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qrql8"
Apr 22 18:38:54.101510 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:54.101474 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-knwbs\" (UniqueName: \"kubernetes.io/projected/e9303b48-0205-46d7-b7c1-bef8c52ae1c2-kube-api-access-knwbs\") pod \"migrator-74bb7799d9-qrql8\" (UID: \"e9303b48-0205-46d7-b7c1-bef8c52ae1c2\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qrql8"
Apr 22 18:38:54.111177 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:54.111141 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-knwbs\" (UniqueName: \"kubernetes.io/projected/e9303b48-0205-46d7-b7c1-bef8c52ae1c2-kube-api-access-knwbs\") pod \"migrator-74bb7799d9-qrql8\" (UID: \"e9303b48-0205-46d7-b7c1-bef8c52ae1c2\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qrql8"
Apr 22 18:38:54.204585 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:54.204478 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qrql8"
Apr 22 18:38:54.332886 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:54.332847 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-qrql8"]
Apr 22 18:38:54.338831 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:38:54.338790 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9303b48_0205_46d7_b7c1_bef8c52ae1c2.slice/crio-bb04e7c41d6b1567ada478970e6ec2c9644f392d118db72373d1f3952442ce48 WatchSource:0}: Error finding container bb04e7c41d6b1567ada478970e6ec2c9644f392d118db72373d1f3952442ce48: Status 404 returned error can't find the container with id bb04e7c41d6b1567ada478970e6ec2c9644f392d118db72373d1f3952442ce48
Apr 22 18:38:54.754352 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:54.754310 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qrql8" event={"ID":"e9303b48-0205-46d7-b7c1-bef8c52ae1c2","Type":"ContainerStarted","Data":"bb04e7c41d6b1567ada478970e6ec2c9644f392d118db72373d1f3952442ce48"}
Apr 22 18:38:54.929758 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:54.929726 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-htng9_b0d1706a-8c90-4a85-abfe-07cf53827364/dns-node-resolver/0.log"
Apr 22 18:38:55.732800 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:55.732771 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-8vfmp_5cac15d4-0328-41c3-8bbc-e6d020fb09d2/node-ca/0.log"
Apr 22 18:38:56.016854 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:56.016822 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2812ef50-c4e4-42a2-940f-01dd5bf968d4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-pc6pk\" (UID: \"2812ef50-c4e4-42a2-940f-01dd5bf968d4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pc6pk"
Apr 22 18:38:56.017149 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:56.016990 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 18:38:56.017149 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:56.017053 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2812ef50-c4e4-42a2-940f-01dd5bf968d4-cluster-monitoring-operator-tls podName:2812ef50-c4e4-42a2-940f-01dd5bf968d4 nodeName:}" failed. No retries permitted until 2026-04-22 18:39:04.017035353 +0000 UTC m=+73.007301980 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/2812ef50-c4e4-42a2-940f-01dd5bf968d4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-pc6pk" (UID: "2812ef50-c4e4-42a2-940f-01dd5bf968d4") : secret "cluster-monitoring-operator-tls" not found
Apr 22 18:38:56.118275 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:56.118237 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ef9d655a-2703-42fd-97e9-84bb1fe30e68-registry-tls\") pod \"image-registry-9d4498d6c-bkgtv\" (UID: \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv"
Apr 22 18:38:56.118490 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:56.118286 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ef4308b-cece-496e-a0f7-3f561908b409-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-p6c6r\" (UID: \"6ef4308b-cece-496e-a0f7-3f561908b409\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-p6c6r"
Apr 22 18:38:56.118490 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:56.118385 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 18:38:56.118490 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:56.118389 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:38:56.118490 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:56.118408 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9d4498d6c-bkgtv: secret "image-registry-tls" not found
Apr 22 18:38:56.118490 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:56.118438 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ef4308b-cece-496e-a0f7-3f561908b409-samples-operator-tls podName:6ef4308b-cece-496e-a0f7-3f561908b409 nodeName:}" failed. No retries permitted until 2026-04-22 18:39:04.118424536 +0000 UTC m=+73.108691142 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6ef4308b-cece-496e-a0f7-3f561908b409-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-p6c6r" (UID: "6ef4308b-cece-496e-a0f7-3f561908b409") : secret "samples-operator-tls" not found
Apr 22 18:38:56.118490 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:56.118451 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ef9d655a-2703-42fd-97e9-84bb1fe30e68-registry-tls podName:ef9d655a-2703-42fd-97e9-84bb1fe30e68 nodeName:}" failed. No retries permitted until 2026-04-22 18:39:04.118445686 +0000 UTC m=+73.108712292 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ef9d655a-2703-42fd-97e9-84bb1fe30e68-registry-tls") pod "image-registry-9d4498d6c-bkgtv" (UID: "ef9d655a-2703-42fd-97e9-84bb1fe30e68") : secret "image-registry-tls" not found
Apr 22 18:38:56.320108 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:56.320017 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c91e57b3-5752-444f-9b93-89af6a4673a4-cert\") pod \"ingress-canary-xpp6f\" (UID: \"c91e57b3-5752-444f-9b93-89af6a4673a4\") " pod="openshift-ingress-canary/ingress-canary-xpp6f"
Apr 22 18:38:56.320108 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:56.320054 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4-metrics-tls\") pod \"dns-default-sv8gg\" (UID: \"eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4\") " pod="openshift-dns/dns-default-sv8gg"
Apr 22 18:38:56.320292 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:56.320173 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:38:56.320292 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:56.320175 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:38:56.320292 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:56.320227 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4-metrics-tls podName:eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4 nodeName:}" failed. No retries permitted until 2026-04-22 18:39:28.320213854 +0000 UTC m=+97.310480460 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4-metrics-tls") pod "dns-default-sv8gg" (UID: "eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4") : secret "dns-default-metrics-tls" not found
Apr 22 18:38:56.320292 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:56.320240 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c91e57b3-5752-444f-9b93-89af6a4673a4-cert podName:c91e57b3-5752-444f-9b93-89af6a4673a4 nodeName:}" failed. No retries permitted until 2026-04-22 18:39:28.32023408 +0000 UTC m=+97.310500687 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c91e57b3-5752-444f-9b93-89af6a4673a4-cert") pod "ingress-canary-xpp6f" (UID: "c91e57b3-5752-444f-9b93-89af6a4673a4") : secret "canary-serving-cert" not found
Apr 22 18:38:56.761193 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:56.761163 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qrql8" event={"ID":"e9303b48-0205-46d7-b7c1-bef8c52ae1c2","Type":"ContainerStarted","Data":"109a309301252e76f5111daa8089ed89ecaa76a1b6a515b7d7e35b944071f6e8"}
Apr 22 18:38:56.761193 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:56.761199 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qrql8" event={"ID":"e9303b48-0205-46d7-b7c1-bef8c52ae1c2","Type":"ContainerStarted","Data":"5cffdbdff12377bff0fa73258abc2d64adbd90d776002b5a06b84bb3de612212"}
Apr 22 18:38:56.778176 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:56.778125 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qrql8" podStartSLOduration=2.325146939 podStartE2EDuration="3.77811204s" podCreationTimestamp="2026-04-22 18:38:53 +0000 UTC" firstStartedPulling="2026-04-22 18:38:54.341224138 +0000 UTC m=+63.331490744" lastFinishedPulling="2026-04-22 18:38:55.794189222 +0000 UTC m=+64.784455845" observedRunningTime="2026-04-22 18:38:56.776886591 +0000 UTC m=+65.767153220" watchObservedRunningTime="2026-04-22 18:38:56.77811204 +0000 UTC m=+65.768378668"
Apr 22 18:38:57.226721 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:38:57.226677 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f-metrics-certs\") pod \"network-metrics-daemon-bkhdr\" (UID: \"9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f\") " pod="openshift-multus/network-metrics-daemon-bkhdr"
Apr 22 18:38:57.227116 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:57.226797 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 18:38:57.227116 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:38:57.226851 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f-metrics-certs podName:9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f nodeName:}" failed. No retries permitted until 2026-04-22 18:40:01.226836905 +0000 UTC m=+130.217103512 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f-metrics-certs") pod "network-metrics-daemon-bkhdr" (UID: "9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f") : secret "metrics-daemon-secret" not found
Apr 22 18:39:01.707737 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:01.707707 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-fqfx5"
Apr 22 18:39:04.083628 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:04.083571 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2812ef50-c4e4-42a2-940f-01dd5bf968d4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-pc6pk\" (UID: \"2812ef50-c4e4-42a2-940f-01dd5bf968d4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pc6pk"
Apr 22 18:39:04.084027 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:39:04.083712 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 18:39:04.084027 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:39:04.083784 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2812ef50-c4e4-42a2-940f-01dd5bf968d4-cluster-monitoring-operator-tls podName:2812ef50-c4e4-42a2-940f-01dd5bf968d4 nodeName:}" failed. No retries permitted until 2026-04-22 18:39:20.083769247 +0000 UTC m=+89.074035854 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/2812ef50-c4e4-42a2-940f-01dd5bf968d4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-pc6pk" (UID: "2812ef50-c4e4-42a2-940f-01dd5bf968d4") : secret "cluster-monitoring-operator-tls" not found
Apr 22 18:39:04.184202 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:04.184167 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ef4308b-cece-496e-a0f7-3f561908b409-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-p6c6r\" (UID: \"6ef4308b-cece-496e-a0f7-3f561908b409\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-p6c6r"
Apr 22 18:39:04.184366 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:04.184256 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ef9d655a-2703-42fd-97e9-84bb1fe30e68-registry-tls\") pod \"image-registry-9d4498d6c-bkgtv\" (UID: \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv"
Apr 22 18:39:04.188363 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:04.187872 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ef4308b-cece-496e-a0f7-3f561908b409-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-p6c6r\" (UID: \"6ef4308b-cece-496e-a0f7-3f561908b409\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-p6c6r"
Apr 22 18:39:04.189686 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:04.189664 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ef9d655a-2703-42fd-97e9-84bb1fe30e68-registry-tls\") pod \"image-registry-9d4498d6c-bkgtv\" (UID: \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv"
Apr 22 18:39:04.285377 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:04.285335 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-d5wh4\""
Apr 22 18:39:04.293318 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:04.293290 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-p6c6r"
Apr 22 18:39:04.307255 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:04.307228 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-fbmmk\""
Apr 22 18:39:04.315414 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:04.315379 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv"
Apr 22 18:39:04.427126 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:04.427100 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-p6c6r"]
Apr 22 18:39:04.452766 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:04.452747 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-9d4498d6c-bkgtv"]
Apr 22 18:39:04.454786 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:39:04.454759 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef9d655a_2703_42fd_97e9_84bb1fe30e68.slice/crio-554630f40092bb6d61e67c67b301bd3f187b11eaf467ebe235eaf348092409f1 WatchSource:0}: Error finding container 554630f40092bb6d61e67c67b301bd3f187b11eaf467ebe235eaf348092409f1: Status 404 returned error can't find the container with id 554630f40092bb6d61e67c67b301bd3f187b11eaf467ebe235eaf348092409f1
Apr 22 18:39:04.782627 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:04.782591 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-p6c6r" event={"ID":"6ef4308b-cece-496e-a0f7-3f561908b409","Type":"ContainerStarted","Data":"2cb55765c0aea572b50e01f7af7b61108a9f2a56205c60a89ccb35c6ceff1911"}
Apr 22 18:39:04.783983 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:04.783958 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv" event={"ID":"ef9d655a-2703-42fd-97e9-84bb1fe30e68","Type":"ContainerStarted","Data":"658b809e489a65bdf87557d6c2552d122b60f3a276961d414adf4d111a9f02b1"}
Apr 22 18:39:04.783983 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:04.783984 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv" event={"ID":"ef9d655a-2703-42fd-97e9-84bb1fe30e68","Type":"ContainerStarted","Data":"554630f40092bb6d61e67c67b301bd3f187b11eaf467ebe235eaf348092409f1"}
Apr 22 18:39:04.784172 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:04.784069 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv"
Apr 22 18:39:04.804897 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:04.804841 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv" podStartSLOduration=16.804828958 podStartE2EDuration="16.804828958s" podCreationTimestamp="2026-04-22 18:38:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:39:04.804180392 +0000 UTC m=+73.794447023" watchObservedRunningTime="2026-04-22 18:39:04.804828958 +0000 UTC m=+73.795095586"
Apr 22 18:39:07.794781 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:07.794746 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-p6c6r" event={"ID":"6ef4308b-cece-496e-a0f7-3f561908b409","Type":"ContainerStarted","Data":"4f59af80f777c534af88042498ff1f6a32c4866719683b24194c8874d6c04afb"}
Apr 22 18:39:07.794781 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:07.794781 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-p6c6r" event={"ID":"6ef4308b-cece-496e-a0f7-3f561908b409","Type":"ContainerStarted","Data":"bbaaa60499c916be2becd60edac88f25e932b3bb23d19d65de5b329b3c758a3e"}
Apr 22 18:39:07.816411 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:07.816354 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-p6c6r" podStartSLOduration=17.407164323 podStartE2EDuration="19.81634137s" podCreationTimestamp="2026-04-22 18:38:48 +0000 UTC" firstStartedPulling="2026-04-22 18:39:04.462779486 +0000 UTC m=+73.453046092" lastFinishedPulling="2026-04-22 18:39:06.871956527 +0000 UTC m=+75.862223139" observedRunningTime="2026-04-22 18:39:07.815836056 +0000 UTC m=+76.806102685" watchObservedRunningTime="2026-04-22 18:39:07.81634137 +0000 UTC m=+76.806607999"
Apr 22 18:39:16.597269 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:16.597151 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-v8tzj"]
Apr 22 18:39:16.602116 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:16.602094 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-v8tzj"
Apr 22 18:39:16.608117 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:16.608093 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 22 18:39:16.608990 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:16.608974 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 22 18:39:16.609426 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:16.609410 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-x7wl2\""
Apr 22 18:39:16.609535 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:16.609518 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 22 18:39:16.614922 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:16.614902 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 22 18:39:16.632703 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:16.632671 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-v8tzj"]
Apr 22 18:39:16.678471 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:16.678437 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae-crio-socket\") pod \"insights-runtime-extractor-v8tzj\" (UID: \"d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae\") " pod="openshift-insights/insights-runtime-extractor-v8tzj"
Apr 22 18:39:16.678685 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:16.678506 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-v8tzj\" (UID: \"d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae\") " pod="openshift-insights/insights-runtime-extractor-v8tzj"
Apr 22 18:39:16.678685 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:16.678595 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae-data-volume\") pod \"insights-runtime-extractor-v8tzj\" (UID: \"d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae\") " pod="openshift-insights/insights-runtime-extractor-v8tzj"
Apr 22 18:39:16.678685 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:16.678624 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-v8tzj\" (UID: \"d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae\") " pod="openshift-insights/insights-runtime-extractor-v8tzj"
Apr 22 18:39:16.678685 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:16.678649 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9thf\" (UniqueName: \"kubernetes.io/projected/d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae-kube-api-access-l9thf\") pod \"insights-runtime-extractor-v8tzj\" (UID: \"d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae\") " pod="openshift-insights/insights-runtime-extractor-v8tzj"
Apr 22 18:39:16.702399 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:16.702366 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-9d4498d6c-bkgtv"]
Apr 22 18:39:16.779645 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:16.779611 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae-crio-socket\") pod \"insights-runtime-extractor-v8tzj\" (UID: \"d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae\") " pod="openshift-insights/insights-runtime-extractor-v8tzj"
Apr 22 18:39:16.779645 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:16.779649 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-v8tzj\" (UID: \"d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae\") " pod="openshift-insights/insights-runtime-extractor-v8tzj"
Apr 22 18:39:16.779838 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:16.779729 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae-data-volume\") pod \"insights-runtime-extractor-v8tzj\" (UID: \"d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae\") " pod="openshift-insights/insights-runtime-extractor-v8tzj"
Apr 22 18:39:16.779838 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:16.779732 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae-crio-socket\") pod \"insights-runtime-extractor-v8tzj\" (UID: \"d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae\") " pod="openshift-insights/insights-runtime-extractor-v8tzj"
Apr 22 18:39:16.779838 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:16.779762 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-v8tzj\" (UID: \"d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae\") " pod="openshift-insights/insights-runtime-extractor-v8tzj"
Apr 22 18:39:16.779838 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:16.779789 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l9thf\" (UniqueName: \"kubernetes.io/projected/d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae-kube-api-access-l9thf\") pod \"insights-runtime-extractor-v8tzj\" (UID: \"d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae\") " pod="openshift-insights/insights-runtime-extractor-v8tzj"
Apr 22 18:39:16.780099 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:16.780079 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae-data-volume\") pod \"insights-runtime-extractor-v8tzj\" (UID: \"d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae\") " pod="openshift-insights/insights-runtime-extractor-v8tzj"
Apr 22 18:39:16.780195 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:16.780180 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-v8tzj\" (UID: \"d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae\") " pod="openshift-insights/insights-runtime-extractor-v8tzj"
Apr 22 18:39:16.782119 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:16.782099 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-v8tzj\" (UID: \"d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae\") " pod="openshift-insights/insights-runtime-extractor-v8tzj"
Apr 22 18:39:16.790498 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:16.790453 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9thf\" (UniqueName: \"kubernetes.io/projected/d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae-kube-api-access-l9thf\") pod
\"insights-runtime-extractor-v8tzj\" (UID: \"d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae\") " pod="openshift-insights/insights-runtime-extractor-v8tzj" Apr 22 18:39:16.911363 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:16.911332 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-v8tzj" Apr 22 18:39:17.057808 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:17.057784 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-v8tzj"] Apr 22 18:39:17.060092 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:39:17.060061 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd51fb0f2_c7fa_4df1_bc32_d4d93320f4ae.slice/crio-6830bd80acad2c70f3533eb56a58a54162b579778c60fe474099278b7122bd9d WatchSource:0}: Error finding container 6830bd80acad2c70f3533eb56a58a54162b579778c60fe474099278b7122bd9d: Status 404 returned error can't find the container with id 6830bd80acad2c70f3533eb56a58a54162b579778c60fe474099278b7122bd9d Apr 22 18:39:17.823906 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:17.823877 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v8tzj" event={"ID":"d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae","Type":"ContainerStarted","Data":"6ae74533f38dd7b5e79326d363a0c68333f506a2b990d2051638d5492ab432b0"} Apr 22 18:39:17.823906 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:17.823911 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v8tzj" event={"ID":"d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae","Type":"ContainerStarted","Data":"6830bd80acad2c70f3533eb56a58a54162b579778c60fe474099278b7122bd9d"} Apr 22 18:39:18.828340 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:18.828299 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-v8tzj" event={"ID":"d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae","Type":"ContainerStarted","Data":"a0c7617c34559d708f6114f0b0a3b2e87b9cd91b997ca44d45f5bf3077ac0b96"} Apr 22 18:39:19.832537 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:19.832498 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v8tzj" event={"ID":"d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae","Type":"ContainerStarted","Data":"887d3f5a816addece88eea3c9b1a6340c62c88e0d6963bbbc1ce8866d4a29163"} Apr 22 18:39:19.851355 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:19.851306 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-v8tzj" podStartSLOduration=1.738674997 podStartE2EDuration="3.851292396s" podCreationTimestamp="2026-04-22 18:39:16 +0000 UTC" firstStartedPulling="2026-04-22 18:39:17.115165892 +0000 UTC m=+86.105432500" lastFinishedPulling="2026-04-22 18:39:19.227783292 +0000 UTC m=+88.218049899" observedRunningTime="2026-04-22 18:39:19.85005684 +0000 UTC m=+88.840323480" watchObservedRunningTime="2026-04-22 18:39:19.851292396 +0000 UTC m=+88.841559025" Apr 22 18:39:20.107714 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:20.107616 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2812ef50-c4e4-42a2-940f-01dd5bf968d4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-pc6pk\" (UID: \"2812ef50-c4e4-42a2-940f-01dd5bf968d4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pc6pk" Apr 22 18:39:20.110049 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:20.110012 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2812ef50-c4e4-42a2-940f-01dd5bf968d4-cluster-monitoring-operator-tls\") pod 
\"cluster-monitoring-operator-75587bd455-pc6pk\" (UID: \"2812ef50-c4e4-42a2-940f-01dd5bf968d4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pc6pk" Apr 22 18:39:20.381010 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:20.380927 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-zclrv\"" Apr 22 18:39:20.389569 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:20.389522 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pc6pk" Apr 22 18:39:20.508869 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:20.508832 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-pc6pk"] Apr 22 18:39:20.513038 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:39:20.513001 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2812ef50_c4e4_42a2_940f_01dd5bf968d4.slice/crio-ef19fca0cc82395d955f1ebaf63116845e72680d154ed14eac6472c0ad5e5f0a WatchSource:0}: Error finding container ef19fca0cc82395d955f1ebaf63116845e72680d154ed14eac6472c0ad5e5f0a: Status 404 returned error can't find the container with id ef19fca0cc82395d955f1ebaf63116845e72680d154ed14eac6472c0ad5e5f0a Apr 22 18:39:20.836167 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:20.836125 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pc6pk" event={"ID":"2812ef50-c4e4-42a2-940f-01dd5bf968d4","Type":"ContainerStarted","Data":"ef19fca0cc82395d955f1ebaf63116845e72680d154ed14eac6472c0ad5e5f0a"} Apr 22 18:39:22.845220 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:22.845184 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pc6pk" 
event={"ID":"2812ef50-c4e4-42a2-940f-01dd5bf968d4","Type":"ContainerStarted","Data":"5d6867c3044fcbac00d5c7fed0b52c5b95625c86ae2ff6e251d6b805a1327b22"} Apr 22 18:39:22.862211 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:22.862152 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pc6pk" podStartSLOduration=33.178118734 podStartE2EDuration="34.862137117s" podCreationTimestamp="2026-04-22 18:38:48 +0000 UTC" firstStartedPulling="2026-04-22 18:39:20.515219372 +0000 UTC m=+89.505485978" lastFinishedPulling="2026-04-22 18:39:22.19923774 +0000 UTC m=+91.189504361" observedRunningTime="2026-04-22 18:39:22.861501276 +0000 UTC m=+91.851767906" watchObservedRunningTime="2026-04-22 18:39:22.862137117 +0000 UTC m=+91.852403784" Apr 22 18:39:23.205828 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:23.205792 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5499c67d88-dvxcr"] Apr 22 18:39:23.208976 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:23.208958 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5499c67d88-dvxcr" Apr 22 18:39:23.211515 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:23.211493 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 18:39:23.211844 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:23.211816 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 18:39:23.211844 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:23.211836 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 18:39:23.212028 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:23.211889 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 18:39:23.212028 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:23.211914 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-xkjq2\"" Apr 22 18:39:23.212028 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:23.211920 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 18:39:23.212028 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:23.211965 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 18:39:23.212663 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:23.212637 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 18:39:23.219594 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:23.219562 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5499c67d88-dvxcr"] Apr 22 18:39:23.330943 ip-10-0-132-151 kubenswrapper[2578]: I0422 
18:39:23.330902 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d05fdc27-f402-4383-be60-02c7fb718f9a-service-ca\") pod \"console-5499c67d88-dvxcr\" (UID: \"d05fdc27-f402-4383-be60-02c7fb718f9a\") " pod="openshift-console/console-5499c67d88-dvxcr" Apr 22 18:39:23.331137 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:23.330965 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d05fdc27-f402-4383-be60-02c7fb718f9a-console-config\") pod \"console-5499c67d88-dvxcr\" (UID: \"d05fdc27-f402-4383-be60-02c7fb718f9a\") " pod="openshift-console/console-5499c67d88-dvxcr" Apr 22 18:39:23.331137 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:23.331009 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm7hx\" (UniqueName: \"kubernetes.io/projected/d05fdc27-f402-4383-be60-02c7fb718f9a-kube-api-access-bm7hx\") pod \"console-5499c67d88-dvxcr\" (UID: \"d05fdc27-f402-4383-be60-02c7fb718f9a\") " pod="openshift-console/console-5499c67d88-dvxcr" Apr 22 18:39:23.331137 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:23.331036 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d05fdc27-f402-4383-be60-02c7fb718f9a-oauth-serving-cert\") pod \"console-5499c67d88-dvxcr\" (UID: \"d05fdc27-f402-4383-be60-02c7fb718f9a\") " pod="openshift-console/console-5499c67d88-dvxcr" Apr 22 18:39:23.331137 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:23.331092 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d05fdc27-f402-4383-be60-02c7fb718f9a-console-oauth-config\") pod \"console-5499c67d88-dvxcr\" (UID: 
\"d05fdc27-f402-4383-be60-02c7fb718f9a\") " pod="openshift-console/console-5499c67d88-dvxcr" Apr 22 18:39:23.331137 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:23.331133 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d05fdc27-f402-4383-be60-02c7fb718f9a-console-serving-cert\") pod \"console-5499c67d88-dvxcr\" (UID: \"d05fdc27-f402-4383-be60-02c7fb718f9a\") " pod="openshift-console/console-5499c67d88-dvxcr" Apr 22 18:39:23.432044 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:23.432007 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d05fdc27-f402-4383-be60-02c7fb718f9a-oauth-serving-cert\") pod \"console-5499c67d88-dvxcr\" (UID: \"d05fdc27-f402-4383-be60-02c7fb718f9a\") " pod="openshift-console/console-5499c67d88-dvxcr" Apr 22 18:39:23.432124 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:23.432049 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d05fdc27-f402-4383-be60-02c7fb718f9a-console-oauth-config\") pod \"console-5499c67d88-dvxcr\" (UID: \"d05fdc27-f402-4383-be60-02c7fb718f9a\") " pod="openshift-console/console-5499c67d88-dvxcr" Apr 22 18:39:23.432124 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:23.432098 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d05fdc27-f402-4383-be60-02c7fb718f9a-console-serving-cert\") pod \"console-5499c67d88-dvxcr\" (UID: \"d05fdc27-f402-4383-be60-02c7fb718f9a\") " pod="openshift-console/console-5499c67d88-dvxcr" Apr 22 18:39:23.432189 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:23.432142 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/d05fdc27-f402-4383-be60-02c7fb718f9a-service-ca\") pod \"console-5499c67d88-dvxcr\" (UID: \"d05fdc27-f402-4383-be60-02c7fb718f9a\") " pod="openshift-console/console-5499c67d88-dvxcr" Apr 22 18:39:23.432224 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:23.432187 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d05fdc27-f402-4383-be60-02c7fb718f9a-console-config\") pod \"console-5499c67d88-dvxcr\" (UID: \"d05fdc27-f402-4383-be60-02c7fb718f9a\") " pod="openshift-console/console-5499c67d88-dvxcr" Apr 22 18:39:23.432256 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:23.432230 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bm7hx\" (UniqueName: \"kubernetes.io/projected/d05fdc27-f402-4383-be60-02c7fb718f9a-kube-api-access-bm7hx\") pod \"console-5499c67d88-dvxcr\" (UID: \"d05fdc27-f402-4383-be60-02c7fb718f9a\") " pod="openshift-console/console-5499c67d88-dvxcr" Apr 22 18:39:23.432814 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:23.432780 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d05fdc27-f402-4383-be60-02c7fb718f9a-oauth-serving-cert\") pod \"console-5499c67d88-dvxcr\" (UID: \"d05fdc27-f402-4383-be60-02c7fb718f9a\") " pod="openshift-console/console-5499c67d88-dvxcr" Apr 22 18:39:23.432814 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:23.432808 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d05fdc27-f402-4383-be60-02c7fb718f9a-console-config\") pod \"console-5499c67d88-dvxcr\" (UID: \"d05fdc27-f402-4383-be60-02c7fb718f9a\") " pod="openshift-console/console-5499c67d88-dvxcr" Apr 22 18:39:23.432972 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:23.432897 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d05fdc27-f402-4383-be60-02c7fb718f9a-service-ca\") pod \"console-5499c67d88-dvxcr\" (UID: \"d05fdc27-f402-4383-be60-02c7fb718f9a\") " pod="openshift-console/console-5499c67d88-dvxcr" Apr 22 18:39:23.435071 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:23.435054 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d05fdc27-f402-4383-be60-02c7fb718f9a-console-oauth-config\") pod \"console-5499c67d88-dvxcr\" (UID: \"d05fdc27-f402-4383-be60-02c7fb718f9a\") " pod="openshift-console/console-5499c67d88-dvxcr" Apr 22 18:39:23.435243 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:23.435223 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d05fdc27-f402-4383-be60-02c7fb718f9a-console-serving-cert\") pod \"console-5499c67d88-dvxcr\" (UID: \"d05fdc27-f402-4383-be60-02c7fb718f9a\") " pod="openshift-console/console-5499c67d88-dvxcr" Apr 22 18:39:23.440781 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:23.440749 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm7hx\" (UniqueName: \"kubernetes.io/projected/d05fdc27-f402-4383-be60-02c7fb718f9a-kube-api-access-bm7hx\") pod \"console-5499c67d88-dvxcr\" (UID: \"d05fdc27-f402-4383-be60-02c7fb718f9a\") " pod="openshift-console/console-5499c67d88-dvxcr" Apr 22 18:39:23.522891 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:23.522805 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5499c67d88-dvxcr" Apr 22 18:39:23.640382 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:23.640342 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5499c67d88-dvxcr"] Apr 22 18:39:23.643403 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:39:23.643375 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd05fdc27_f402_4383_be60_02c7fb718f9a.slice/crio-bac2b85064c9bca91ce4bec78c2dd8e810abee8fa90e033b447e72b0a4e757f1 WatchSource:0}: Error finding container bac2b85064c9bca91ce4bec78c2dd8e810abee8fa90e033b447e72b0a4e757f1: Status 404 returned error can't find the container with id bac2b85064c9bca91ce4bec78c2dd8e810abee8fa90e033b447e72b0a4e757f1 Apr 22 18:39:23.849179 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:23.849090 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5499c67d88-dvxcr" event={"ID":"d05fdc27-f402-4383-be60-02c7fb718f9a","Type":"ContainerStarted","Data":"bac2b85064c9bca91ce4bec78c2dd8e810abee8fa90e033b447e72b0a4e757f1"} Apr 22 18:39:24.629079 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:24.629042 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-777c754768-2vxw5"] Apr 22 18:39:24.632191 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:24.632168 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-777c754768-2vxw5" Apr 22 18:39:24.640862 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:24.640626 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 18:39:24.642108 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:24.642084 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-777c754768-2vxw5"] Apr 22 18:39:24.743197 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:24.743158 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf80386c-cac4-45f7-85c0-e71274afcf40-console-serving-cert\") pod \"console-777c754768-2vxw5\" (UID: \"cf80386c-cac4-45f7-85c0-e71274afcf40\") " pod="openshift-console/console-777c754768-2vxw5" Apr 22 18:39:24.743393 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:24.743264 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cf80386c-cac4-45f7-85c0-e71274afcf40-console-oauth-config\") pod \"console-777c754768-2vxw5\" (UID: \"cf80386c-cac4-45f7-85c0-e71274afcf40\") " pod="openshift-console/console-777c754768-2vxw5" Apr 22 18:39:24.743393 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:24.743322 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf80386c-cac4-45f7-85c0-e71274afcf40-trusted-ca-bundle\") pod \"console-777c754768-2vxw5\" (UID: \"cf80386c-cac4-45f7-85c0-e71274afcf40\") " pod="openshift-console/console-777c754768-2vxw5" Apr 22 18:39:24.743393 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:24.743361 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/cf80386c-cac4-45f7-85c0-e71274afcf40-console-config\") pod \"console-777c754768-2vxw5\" (UID: \"cf80386c-cac4-45f7-85c0-e71274afcf40\") " pod="openshift-console/console-777c754768-2vxw5" Apr 22 18:39:24.743580 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:24.743424 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cf80386c-cac4-45f7-85c0-e71274afcf40-oauth-serving-cert\") pod \"console-777c754768-2vxw5\" (UID: \"cf80386c-cac4-45f7-85c0-e71274afcf40\") " pod="openshift-console/console-777c754768-2vxw5" Apr 22 18:39:24.743580 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:24.743480 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2qll\" (UniqueName: \"kubernetes.io/projected/cf80386c-cac4-45f7-85c0-e71274afcf40-kube-api-access-v2qll\") pod \"console-777c754768-2vxw5\" (UID: \"cf80386c-cac4-45f7-85c0-e71274afcf40\") " pod="openshift-console/console-777c754768-2vxw5" Apr 22 18:39:24.743580 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:24.743526 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cf80386c-cac4-45f7-85c0-e71274afcf40-service-ca\") pod \"console-777c754768-2vxw5\" (UID: \"cf80386c-cac4-45f7-85c0-e71274afcf40\") " pod="openshift-console/console-777c754768-2vxw5" Apr 22 18:39:24.844514 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:24.844450 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf80386c-cac4-45f7-85c0-e71274afcf40-console-serving-cert\") pod \"console-777c754768-2vxw5\" (UID: \"cf80386c-cac4-45f7-85c0-e71274afcf40\") " pod="openshift-console/console-777c754768-2vxw5" Apr 22 18:39:24.844718 ip-10-0-132-151 kubenswrapper[2578]: I0422 
18:39:24.844544 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cf80386c-cac4-45f7-85c0-e71274afcf40-console-oauth-config\") pod \"console-777c754768-2vxw5\" (UID: \"cf80386c-cac4-45f7-85c0-e71274afcf40\") " pod="openshift-console/console-777c754768-2vxw5" Apr 22 18:39:24.844718 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:24.844588 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf80386c-cac4-45f7-85c0-e71274afcf40-trusted-ca-bundle\") pod \"console-777c754768-2vxw5\" (UID: \"cf80386c-cac4-45f7-85c0-e71274afcf40\") " pod="openshift-console/console-777c754768-2vxw5" Apr 22 18:39:24.844718 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:24.844626 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cf80386c-cac4-45f7-85c0-e71274afcf40-console-config\") pod \"console-777c754768-2vxw5\" (UID: \"cf80386c-cac4-45f7-85c0-e71274afcf40\") " pod="openshift-console/console-777c754768-2vxw5" Apr 22 18:39:24.844718 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:24.844684 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cf80386c-cac4-45f7-85c0-e71274afcf40-oauth-serving-cert\") pod \"console-777c754768-2vxw5\" (UID: \"cf80386c-cac4-45f7-85c0-e71274afcf40\") " pod="openshift-console/console-777c754768-2vxw5" Apr 22 18:39:24.844929 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:24.844823 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2qll\" (UniqueName: \"kubernetes.io/projected/cf80386c-cac4-45f7-85c0-e71274afcf40-kube-api-access-v2qll\") pod \"console-777c754768-2vxw5\" (UID: \"cf80386c-cac4-45f7-85c0-e71274afcf40\") " 
pod="openshift-console/console-777c754768-2vxw5" Apr 22 18:39:24.844929 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:24.844871 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cf80386c-cac4-45f7-85c0-e71274afcf40-service-ca\") pod \"console-777c754768-2vxw5\" (UID: \"cf80386c-cac4-45f7-85c0-e71274afcf40\") " pod="openshift-console/console-777c754768-2vxw5" Apr 22 18:39:24.845435 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:24.845409 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cf80386c-cac4-45f7-85c0-e71274afcf40-console-config\") pod \"console-777c754768-2vxw5\" (UID: \"cf80386c-cac4-45f7-85c0-e71274afcf40\") " pod="openshift-console/console-777c754768-2vxw5" Apr 22 18:39:24.845573 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:24.845438 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cf80386c-cac4-45f7-85c0-e71274afcf40-oauth-serving-cert\") pod \"console-777c754768-2vxw5\" (UID: \"cf80386c-cac4-45f7-85c0-e71274afcf40\") " pod="openshift-console/console-777c754768-2vxw5" Apr 22 18:39:24.845573 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:24.845545 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cf80386c-cac4-45f7-85c0-e71274afcf40-service-ca\") pod \"console-777c754768-2vxw5\" (UID: \"cf80386c-cac4-45f7-85c0-e71274afcf40\") " pod="openshift-console/console-777c754768-2vxw5" Apr 22 18:39:24.845716 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:24.845698 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf80386c-cac4-45f7-85c0-e71274afcf40-trusted-ca-bundle\") pod \"console-777c754768-2vxw5\" (UID: 
\"cf80386c-cac4-45f7-85c0-e71274afcf40\") " pod="openshift-console/console-777c754768-2vxw5" Apr 22 18:39:24.847012 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:24.846985 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cf80386c-cac4-45f7-85c0-e71274afcf40-console-oauth-config\") pod \"console-777c754768-2vxw5\" (UID: \"cf80386c-cac4-45f7-85c0-e71274afcf40\") " pod="openshift-console/console-777c754768-2vxw5" Apr 22 18:39:24.847137 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:24.847117 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf80386c-cac4-45f7-85c0-e71274afcf40-console-serving-cert\") pod \"console-777c754768-2vxw5\" (UID: \"cf80386c-cac4-45f7-85c0-e71274afcf40\") " pod="openshift-console/console-777c754768-2vxw5" Apr 22 18:39:24.853320 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:24.853291 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2qll\" (UniqueName: \"kubernetes.io/projected/cf80386c-cac4-45f7-85c0-e71274afcf40-kube-api-access-v2qll\") pod \"console-777c754768-2vxw5\" (UID: \"cf80386c-cac4-45f7-85c0-e71274afcf40\") " pod="openshift-console/console-777c754768-2vxw5" Apr 22 18:39:24.943627 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:24.943585 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-777c754768-2vxw5"
Apr 22 18:39:25.080799 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:25.080746    2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-777c754768-2vxw5"]
Apr 22 18:39:25.083121 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:39:25.083090    2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf80386c_cac4_45f7_85c0_e71274afcf40.slice/crio-c86ceb3320dd4d1e6db14a851d9f3701904d70d7b2687ea5826dc7255ee4332b WatchSource:0}: Error finding container c86ceb3320dd4d1e6db14a851d9f3701904d70d7b2687ea5826dc7255ee4332b: Status 404 returned error can't find the container with id c86ceb3320dd4d1e6db14a851d9f3701904d70d7b2687ea5826dc7255ee4332b
Apr 22 18:39:25.855602 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:25.855561    2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-777c754768-2vxw5" event={"ID":"cf80386c-cac4-45f7-85c0-e71274afcf40","Type":"ContainerStarted","Data":"c86ceb3320dd4d1e6db14a851d9f3701904d70d7b2687ea5826dc7255ee4332b"}
Apr 22 18:39:26.707094 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:26.707055    2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv"
Apr 22 18:39:26.860400 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:26.860303    2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5499c67d88-dvxcr" event={"ID":"d05fdc27-f402-4383-be60-02c7fb718f9a","Type":"ContainerStarted","Data":"bf411eac6be2c5b04293cb6f51017e61c96d6ddfe98198c409f517df43795fb5"}
Apr 22 18:39:26.861708 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:26.861681    2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-777c754768-2vxw5" event={"ID":"cf80386c-cac4-45f7-85c0-e71274afcf40","Type":"ContainerStarted","Data":"298d2ba668adf741d1fffe93bb653c19c6ae9123dd85455f3bd34eb9fa42628a"}
Apr 22 18:39:26.878142 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:26.878092    2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5499c67d88-dvxcr" podStartSLOduration=0.995740878 podStartE2EDuration="3.878076505s" podCreationTimestamp="2026-04-22 18:39:23 +0000 UTC" firstStartedPulling="2026-04-22 18:39:23.645824192 +0000 UTC m=+92.636090799" lastFinishedPulling="2026-04-22 18:39:26.528159807 +0000 UTC m=+95.518426426" observedRunningTime="2026-04-22 18:39:26.87694928 +0000 UTC m=+95.867215922" watchObservedRunningTime="2026-04-22 18:39:26.878076505 +0000 UTC m=+95.868343134"
Apr 22 18:39:26.893950 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:26.893898    2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-777c754768-2vxw5" podStartSLOduration=1.450447904 podStartE2EDuration="2.893885099s" podCreationTimestamp="2026-04-22 18:39:24 +0000 UTC" firstStartedPulling="2026-04-22 18:39:25.08537022 +0000 UTC m=+94.075636840" lastFinishedPulling="2026-04-22 18:39:26.528807413 +0000 UTC m=+95.519074035" observedRunningTime="2026-04-22 18:39:26.893421621 +0000 UTC m=+95.883688251" watchObservedRunningTime="2026-04-22 18:39:26.893885099 +0000 UTC m=+95.884151727"
Apr 22 18:39:28.377294 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:28.377240    2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c91e57b3-5752-444f-9b93-89af6a4673a4-cert\") pod \"ingress-canary-xpp6f\" (UID: \"c91e57b3-5752-444f-9b93-89af6a4673a4\") " pod="openshift-ingress-canary/ingress-canary-xpp6f"
Apr 22 18:39:28.377723 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:28.377349    2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4-metrics-tls\") pod \"dns-default-sv8gg\" (UID: \"eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4\") " pod="openshift-dns/dns-default-sv8gg"
Apr 22 18:39:28.379770 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:28.379745    2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c91e57b3-5752-444f-9b93-89af6a4673a4-cert\") pod \"ingress-canary-xpp6f\" (UID: \"c91e57b3-5752-444f-9b93-89af6a4673a4\") " pod="openshift-ingress-canary/ingress-canary-xpp6f"
Apr 22 18:39:28.379846 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:28.379752    2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4-metrics-tls\") pod \"dns-default-sv8gg\" (UID: \"eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4\") " pod="openshift-dns/dns-default-sv8gg"
Apr 22 18:39:28.601716 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:28.601677    2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-cfw2h\""
Apr 22 18:39:28.608060 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:28.608029    2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rz76n\""
Apr 22 18:39:28.609922 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:28.609906    2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-sv8gg"
Apr 22 18:39:28.616930 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:28.616901    2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xpp6f"
Apr 22 18:39:28.766453 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:28.766427    2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-sv8gg"]
Apr 22 18:39:28.768985 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:39:28.768957    2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb4a471c_0b73_4b4f_9dcf_8c9a29e743e4.slice/crio-4da2b4c3a998f08063a2b9ffc649428630751d6479b9ea61761827935c83921e WatchSource:0}: Error finding container 4da2b4c3a998f08063a2b9ffc649428630751d6479b9ea61761827935c83921e: Status 404 returned error can't find the container with id 4da2b4c3a998f08063a2b9ffc649428630751d6479b9ea61761827935c83921e
Apr 22 18:39:28.782663 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:28.782638    2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xpp6f"]
Apr 22 18:39:28.785238 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:39:28.785210    2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc91e57b3_5752_444f_9b93_89af6a4673a4.slice/crio-32ab0c8e2fae9a53844b5cd9da81394e995d7ff9b72490ddef4b785245c9d8f2 WatchSource:0}: Error finding container 32ab0c8e2fae9a53844b5cd9da81394e995d7ff9b72490ddef4b785245c9d8f2: Status 404 returned error can't find the container with id 32ab0c8e2fae9a53844b5cd9da81394e995d7ff9b72490ddef4b785245c9d8f2
Apr 22 18:39:28.867973 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:28.867938    2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xpp6f" event={"ID":"c91e57b3-5752-444f-9b93-89af6a4673a4","Type":"ContainerStarted","Data":"32ab0c8e2fae9a53844b5cd9da81394e995d7ff9b72490ddef4b785245c9d8f2"}
Apr 22 18:39:28.868954 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:28.868926    2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sv8gg" event={"ID":"eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4","Type":"ContainerStarted","Data":"4da2b4c3a998f08063a2b9ffc649428630751d6479b9ea61761827935c83921e"}
Apr 22 18:39:30.178075 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.178033    2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-w87gw"]
Apr 22 18:39:30.182614 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.182584    2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-w87gw"
Apr 22 18:39:30.185017 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.184934    2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 18:39:30.185017 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.184993    2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 18:39:30.185276 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.185127    2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 18:39:30.185276 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.185173    2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 18:39:30.185948 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.185928    2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-mt2n7\""
Apr 22 18:39:30.192915 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.192888    2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/94e10a47-4857-4cff-9fbb-fbf8f76c8ae6-node-exporter-accelerators-collector-config\") pod \"node-exporter-w87gw\" (UID: \"94e10a47-4857-4cff-9fbb-fbf8f76c8ae6\") " pod="openshift-monitoring/node-exporter-w87gw"
Apr 22 18:39:30.193045 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.192924    2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/94e10a47-4857-4cff-9fbb-fbf8f76c8ae6-node-exporter-wtmp\") pod \"node-exporter-w87gw\" (UID: \"94e10a47-4857-4cff-9fbb-fbf8f76c8ae6\") " pod="openshift-monitoring/node-exporter-w87gw"
Apr 22 18:39:30.193045 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.192950    2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf4nc\" (UniqueName: \"kubernetes.io/projected/94e10a47-4857-4cff-9fbb-fbf8f76c8ae6-kube-api-access-pf4nc\") pod \"node-exporter-w87gw\" (UID: \"94e10a47-4857-4cff-9fbb-fbf8f76c8ae6\") " pod="openshift-monitoring/node-exporter-w87gw"
Apr 22 18:39:30.193045 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.193036    2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/94e10a47-4857-4cff-9fbb-fbf8f76c8ae6-node-exporter-textfile\") pod \"node-exporter-w87gw\" (UID: \"94e10a47-4857-4cff-9fbb-fbf8f76c8ae6\") " pod="openshift-monitoring/node-exporter-w87gw"
Apr 22 18:39:30.193193 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.193088    2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/94e10a47-4857-4cff-9fbb-fbf8f76c8ae6-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-w87gw\" (UID: \"94e10a47-4857-4cff-9fbb-fbf8f76c8ae6\") " pod="openshift-monitoring/node-exporter-w87gw"
Apr 22 18:39:30.193193 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.193157    2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/94e10a47-4857-4cff-9fbb-fbf8f76c8ae6-node-exporter-tls\") pod \"node-exporter-w87gw\" (UID: \"94e10a47-4857-4cff-9fbb-fbf8f76c8ae6\") " pod="openshift-monitoring/node-exporter-w87gw"
Apr 22 18:39:30.193193 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.193183    2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/94e10a47-4857-4cff-9fbb-fbf8f76c8ae6-sys\") pod \"node-exporter-w87gw\" (UID: \"94e10a47-4857-4cff-9fbb-fbf8f76c8ae6\") " pod="openshift-monitoring/node-exporter-w87gw"
Apr 22 18:39:30.193342 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.193213    2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/94e10a47-4857-4cff-9fbb-fbf8f76c8ae6-root\") pod \"node-exporter-w87gw\" (UID: \"94e10a47-4857-4cff-9fbb-fbf8f76c8ae6\") " pod="openshift-monitoring/node-exporter-w87gw"
Apr 22 18:39:30.193342 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.193236    2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/94e10a47-4857-4cff-9fbb-fbf8f76c8ae6-metrics-client-ca\") pod \"node-exporter-w87gw\" (UID: \"94e10a47-4857-4cff-9fbb-fbf8f76c8ae6\") " pod="openshift-monitoring/node-exporter-w87gw"
Apr 22 18:39:30.294482 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.294413    2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/94e10a47-4857-4cff-9fbb-fbf8f76c8ae6-node-exporter-textfile\") pod \"node-exporter-w87gw\" (UID: \"94e10a47-4857-4cff-9fbb-fbf8f76c8ae6\") " pod="openshift-monitoring/node-exporter-w87gw"
Apr 22 18:39:30.294482 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.294484    2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/94e10a47-4857-4cff-9fbb-fbf8f76c8ae6-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-w87gw\" (UID: \"94e10a47-4857-4cff-9fbb-fbf8f76c8ae6\") " pod="openshift-monitoring/node-exporter-w87gw"
Apr 22 18:39:30.294727 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.294540    2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/94e10a47-4857-4cff-9fbb-fbf8f76c8ae6-node-exporter-tls\") pod \"node-exporter-w87gw\" (UID: \"94e10a47-4857-4cff-9fbb-fbf8f76c8ae6\") " pod="openshift-monitoring/node-exporter-w87gw"
Apr 22 18:39:30.294727 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.294568    2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/94e10a47-4857-4cff-9fbb-fbf8f76c8ae6-sys\") pod \"node-exporter-w87gw\" (UID: \"94e10a47-4857-4cff-9fbb-fbf8f76c8ae6\") " pod="openshift-monitoring/node-exporter-w87gw"
Apr 22 18:39:30.294727 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.294597    2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/94e10a47-4857-4cff-9fbb-fbf8f76c8ae6-root\") pod \"node-exporter-w87gw\" (UID: \"94e10a47-4857-4cff-9fbb-fbf8f76c8ae6\") " pod="openshift-monitoring/node-exporter-w87gw"
Apr 22 18:39:30.294727 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.294618    2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/94e10a47-4857-4cff-9fbb-fbf8f76c8ae6-metrics-client-ca\") pod \"node-exporter-w87gw\" (UID: \"94e10a47-4857-4cff-9fbb-fbf8f76c8ae6\") " pod="openshift-monitoring/node-exporter-w87gw"
Apr 22 18:39:30.294727 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:39:30.294670    2578 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 22 18:39:30.294727 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.294694    2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/94e10a47-4857-4cff-9fbb-fbf8f76c8ae6-node-exporter-accelerators-collector-config\") pod \"node-exporter-w87gw\" (UID: \"94e10a47-4857-4cff-9fbb-fbf8f76c8ae6\") " pod="openshift-monitoring/node-exporter-w87gw"
Apr 22 18:39:30.295029 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:39:30.294742    2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94e10a47-4857-4cff-9fbb-fbf8f76c8ae6-node-exporter-tls podName:94e10a47-4857-4cff-9fbb-fbf8f76c8ae6 nodeName:}" failed. No retries permitted until 2026-04-22 18:39:30.79472017 +0000 UTC m=+99.784986778 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/94e10a47-4857-4cff-9fbb-fbf8f76c8ae6-node-exporter-tls") pod "node-exporter-w87gw" (UID: "94e10a47-4857-4cff-9fbb-fbf8f76c8ae6") : secret "node-exporter-tls" not found
Apr 22 18:39:30.295029 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.294757    2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/94e10a47-4857-4cff-9fbb-fbf8f76c8ae6-root\") pod \"node-exporter-w87gw\" (UID: \"94e10a47-4857-4cff-9fbb-fbf8f76c8ae6\") " pod="openshift-monitoring/node-exporter-w87gw"
Apr 22 18:39:30.295029 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.294813    2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/94e10a47-4857-4cff-9fbb-fbf8f76c8ae6-node-exporter-wtmp\") pod \"node-exporter-w87gw\" (UID: \"94e10a47-4857-4cff-9fbb-fbf8f76c8ae6\") " pod="openshift-monitoring/node-exporter-w87gw"
Apr 22 18:39:30.295029 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.294829    2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/94e10a47-4857-4cff-9fbb-fbf8f76c8ae6-node-exporter-textfile\") pod \"node-exporter-w87gw\" (UID: \"94e10a47-4857-4cff-9fbb-fbf8f76c8ae6\") " pod="openshift-monitoring/node-exporter-w87gw"
Apr 22 18:39:30.295029 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.294846    2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pf4nc\" (UniqueName: \"kubernetes.io/projected/94e10a47-4857-4cff-9fbb-fbf8f76c8ae6-kube-api-access-pf4nc\") pod \"node-exporter-w87gw\" (UID: \"94e10a47-4857-4cff-9fbb-fbf8f76c8ae6\") " pod="openshift-monitoring/node-exporter-w87gw"
Apr 22 18:39:30.295029 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.294891    2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/94e10a47-4857-4cff-9fbb-fbf8f76c8ae6-sys\") pod \"node-exporter-w87gw\" (UID: \"94e10a47-4857-4cff-9fbb-fbf8f76c8ae6\") " pod="openshift-monitoring/node-exporter-w87gw"
Apr 22 18:39:30.295029 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.295008    2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/94e10a47-4857-4cff-9fbb-fbf8f76c8ae6-node-exporter-wtmp\") pod \"node-exporter-w87gw\" (UID: \"94e10a47-4857-4cff-9fbb-fbf8f76c8ae6\") " pod="openshift-monitoring/node-exporter-w87gw"
Apr 22 18:39:30.295358 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.295192    2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/94e10a47-4857-4cff-9fbb-fbf8f76c8ae6-node-exporter-accelerators-collector-config\") pod \"node-exporter-w87gw\" (UID: \"94e10a47-4857-4cff-9fbb-fbf8f76c8ae6\") " pod="openshift-monitoring/node-exporter-w87gw"
Apr 22 18:39:30.295358 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.295219    2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/94e10a47-4857-4cff-9fbb-fbf8f76c8ae6-metrics-client-ca\") pod \"node-exporter-w87gw\" (UID: \"94e10a47-4857-4cff-9fbb-fbf8f76c8ae6\") " pod="openshift-monitoring/node-exporter-w87gw"
Apr 22 18:39:30.297117 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.297097    2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/94e10a47-4857-4cff-9fbb-fbf8f76c8ae6-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-w87gw\" (UID: \"94e10a47-4857-4cff-9fbb-fbf8f76c8ae6\") " pod="openshift-monitoring/node-exporter-w87gw"
Apr 22 18:39:30.303903 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.303880    2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf4nc\" (UniqueName: \"kubernetes.io/projected/94e10a47-4857-4cff-9fbb-fbf8f76c8ae6-kube-api-access-pf4nc\") pod \"node-exporter-w87gw\" (UID: \"94e10a47-4857-4cff-9fbb-fbf8f76c8ae6\") " pod="openshift-monitoring/node-exporter-w87gw"
Apr 22 18:39:30.799667 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.799632    2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/94e10a47-4857-4cff-9fbb-fbf8f76c8ae6-node-exporter-tls\") pod \"node-exporter-w87gw\" (UID: \"94e10a47-4857-4cff-9fbb-fbf8f76c8ae6\") " pod="openshift-monitoring/node-exporter-w87gw"
Apr 22 18:39:30.802551 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.802490    2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/94e10a47-4857-4cff-9fbb-fbf8f76c8ae6-node-exporter-tls\") pod \"node-exporter-w87gw\" (UID: \"94e10a47-4857-4cff-9fbb-fbf8f76c8ae6\") " pod="openshift-monitoring/node-exporter-w87gw"
Apr 22 18:39:30.876656 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.876622    2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xpp6f" event={"ID":"c91e57b3-5752-444f-9b93-89af6a4673a4","Type":"ContainerStarted","Data":"cd09a1ff5c4a99c2e17de559417aebaee466cf3b13c2f60aec5e71e0651bf5d5"}
Apr 22 18:39:30.878221 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.878182    2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sv8gg" event={"ID":"eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4","Type":"ContainerStarted","Data":"aa06b4f6f96698be5e8c4c8973e677ecfbe8dced7bc581a609b862a3401a50bf"}
Apr 22 18:39:30.900290 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:30.897842    2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-xpp6f" podStartSLOduration=64.937550053 podStartE2EDuration="1m6.897819679s" podCreationTimestamp="2026-04-22 18:38:24 +0000 UTC" firstStartedPulling="2026-04-22 18:39:28.786965113 +0000 UTC m=+97.777231720" lastFinishedPulling="2026-04-22 18:39:30.747234738 +0000 UTC m=+99.737501346" observedRunningTime="2026-04-22 18:39:30.896534118 +0000 UTC m=+99.886800747" watchObservedRunningTime="2026-04-22 18:39:30.897819679 +0000 UTC m=+99.888086323"
Apr 22 18:39:31.095785 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:31.095745    2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-w87gw"
Apr 22 18:39:31.104072 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:39:31.104044    2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94e10a47_4857_4cff_9fbb_fbf8f76c8ae6.slice/crio-2624174e0cdbe4739bf082fe36e77f29df578b24323b4c185b951af1ea6d812a WatchSource:0}: Error finding container 2624174e0cdbe4739bf082fe36e77f29df578b24323b4c185b951af1ea6d812a: Status 404 returned error can't find the container with id 2624174e0cdbe4739bf082fe36e77f29df578b24323b4c185b951af1ea6d812a
Apr 22 18:39:31.882947 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:31.882911    2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-w87gw" event={"ID":"94e10a47-4857-4cff-9fbb-fbf8f76c8ae6","Type":"ContainerStarted","Data":"2624174e0cdbe4739bf082fe36e77f29df578b24323b4c185b951af1ea6d812a"}
Apr 22 18:39:31.884529 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:31.884501    2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sv8gg" event={"ID":"eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4","Type":"ContainerStarted","Data":"9f75cde730f8f7f4fbb7ee270bb390d5a5eab6ba0ded2ca7980c048350b2e513"}
Apr 22 18:39:31.884666 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:31.884589    2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-sv8gg"
Apr 22 18:39:31.909912 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:31.909866    2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-sv8gg" podStartSLOduration=65.936541848 podStartE2EDuration="1m7.909852345s" podCreationTimestamp="2026-04-22 18:38:24 +0000 UTC" firstStartedPulling="2026-04-22 18:39:28.770999829 +0000 UTC m=+97.761266435" lastFinishedPulling="2026-04-22 18:39:30.744310321 +0000 UTC m=+99.734576932" observedRunningTime="2026-04-22 18:39:31.908841256 +0000 UTC m=+100.899107885" watchObservedRunningTime="2026-04-22 18:39:31.909852345 +0000 UTC m=+100.900118974"
Apr 22 18:39:32.103959 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.103869    2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8"]
Apr 22 18:39:32.107204 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.107181    2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8"
Apr 22 18:39:32.109614 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.109592    2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 22 18:39:32.109718 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.109593    2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-d3chg8n3gh2sv\""
Apr 22 18:39:32.109718 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.109639    2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 22 18:39:32.109718 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.109596    2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 22 18:39:32.109718 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.109676    2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 22 18:39:32.110078 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.110064    2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-npqsg\""
Apr 22 18:39:32.110145 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.110128    2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 22 18:39:32.118816 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.118794    2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8"]
Apr 22 18:39:32.211513 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.211441    2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/197c7922-b84c-45ba-a3ee-a9b7ca6e75d5-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6f8f9b78fb-vdqt8\" (UID: \"197c7922-b84c-45ba-a3ee-a9b7ca6e75d5\") " pod="openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8"
Apr 22 18:39:32.211715 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.211522    2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/197c7922-b84c-45ba-a3ee-a9b7ca6e75d5-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6f8f9b78fb-vdqt8\" (UID: \"197c7922-b84c-45ba-a3ee-a9b7ca6e75d5\") " pod="openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8"
Apr 22 18:39:32.211715 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.211556    2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/197c7922-b84c-45ba-a3ee-a9b7ca6e75d5-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6f8f9b78fb-vdqt8\" (UID: \"197c7922-b84c-45ba-a3ee-a9b7ca6e75d5\") " pod="openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8"
Apr 22 18:39:32.211715 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.211615    2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/197c7922-b84c-45ba-a3ee-a9b7ca6e75d5-secret-grpc-tls\") pod \"thanos-querier-6f8f9b78fb-vdqt8\" (UID: \"197c7922-b84c-45ba-a3ee-a9b7ca6e75d5\") " pod="openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8"
Apr 22 18:39:32.211715 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.211641    2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/197c7922-b84c-45ba-a3ee-a9b7ca6e75d5-secret-thanos-querier-tls\") pod \"thanos-querier-6f8f9b78fb-vdqt8\" (UID: \"197c7922-b84c-45ba-a3ee-a9b7ca6e75d5\") " pod="openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8"
Apr 22 18:39:32.211715 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.211666    2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj2qz\" (UniqueName: \"kubernetes.io/projected/197c7922-b84c-45ba-a3ee-a9b7ca6e75d5-kube-api-access-kj2qz\") pod \"thanos-querier-6f8f9b78fb-vdqt8\" (UID: \"197c7922-b84c-45ba-a3ee-a9b7ca6e75d5\") " pod="openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8"
Apr 22 18:39:32.211917 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.211709    2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/197c7922-b84c-45ba-a3ee-a9b7ca6e75d5-metrics-client-ca\") pod \"thanos-querier-6f8f9b78fb-vdqt8\" (UID: \"197c7922-b84c-45ba-a3ee-a9b7ca6e75d5\") " pod="openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8"
Apr 22 18:39:32.211917 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.211784    2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/197c7922-b84c-45ba-a3ee-a9b7ca6e75d5-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6f8f9b78fb-vdqt8\" (UID: \"197c7922-b84c-45ba-a3ee-a9b7ca6e75d5\") " pod="openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8"
Apr 22 18:39:32.312395 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.312343    2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/197c7922-b84c-45ba-a3ee-a9b7ca6e75d5-secret-grpc-tls\") pod \"thanos-querier-6f8f9b78fb-vdqt8\" (UID: \"197c7922-b84c-45ba-a3ee-a9b7ca6e75d5\") " pod="openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8"
Apr 22 18:39:32.312395 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.312399    2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/197c7922-b84c-45ba-a3ee-a9b7ca6e75d5-secret-thanos-querier-tls\") pod \"thanos-querier-6f8f9b78fb-vdqt8\" (UID: \"197c7922-b84c-45ba-a3ee-a9b7ca6e75d5\") " pod="openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8"
Apr 22 18:39:32.312700 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.312517    2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kj2qz\" (UniqueName: \"kubernetes.io/projected/197c7922-b84c-45ba-a3ee-a9b7ca6e75d5-kube-api-access-kj2qz\") pod \"thanos-querier-6f8f9b78fb-vdqt8\" (UID: \"197c7922-b84c-45ba-a3ee-a9b7ca6e75d5\") " pod="openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8"
Apr 22 18:39:32.312700 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.312558    2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/197c7922-b84c-45ba-a3ee-a9b7ca6e75d5-metrics-client-ca\") pod \"thanos-querier-6f8f9b78fb-vdqt8\" (UID: \"197c7922-b84c-45ba-a3ee-a9b7ca6e75d5\") " pod="openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8"
Apr 22 18:39:32.312700 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.312599    2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/197c7922-b84c-45ba-a3ee-a9b7ca6e75d5-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6f8f9b78fb-vdqt8\" (UID: \"197c7922-b84c-45ba-a3ee-a9b7ca6e75d5\") " pod="openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8"
Apr 22 18:39:32.312700 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.312629    2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/197c7922-b84c-45ba-a3ee-a9b7ca6e75d5-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6f8f9b78fb-vdqt8\" (UID: \"197c7922-b84c-45ba-a3ee-a9b7ca6e75d5\") " pod="openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8"
Apr 22 18:39:32.312700 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.312655    2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/197c7922-b84c-45ba-a3ee-a9b7ca6e75d5-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6f8f9b78fb-vdqt8\" (UID: \"197c7922-b84c-45ba-a3ee-a9b7ca6e75d5\") " pod="openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8"
Apr 22 18:39:32.312700 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.312682    2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/197c7922-b84c-45ba-a3ee-a9b7ca6e75d5-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6f8f9b78fb-vdqt8\" (UID: \"197c7922-b84c-45ba-a3ee-a9b7ca6e75d5\") " pod="openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8"
Apr 22 18:39:32.313217 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.313198    2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/197c7922-b84c-45ba-a3ee-a9b7ca6e75d5-metrics-client-ca\") pod \"thanos-querier-6f8f9b78fb-vdqt8\" (UID: \"197c7922-b84c-45ba-a3ee-a9b7ca6e75d5\") " pod="openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8"
Apr 22 18:39:32.314997 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.314963    2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/197c7922-b84c-45ba-a3ee-a9b7ca6e75d5-secret-thanos-querier-tls\") pod \"thanos-querier-6f8f9b78fb-vdqt8\" (UID: \"197c7922-b84c-45ba-a3ee-a9b7ca6e75d5\") " pod="openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8"
Apr 22 18:39:32.315812 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.315767    2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/197c7922-b84c-45ba-a3ee-a9b7ca6e75d5-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6f8f9b78fb-vdqt8\" (UID: \"197c7922-b84c-45ba-a3ee-a9b7ca6e75d5\") " pod="openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8"
Apr 22 18:39:32.316020 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.315999    2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/197c7922-b84c-45ba-a3ee-a9b7ca6e75d5-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6f8f9b78fb-vdqt8\" (UID: \"197c7922-b84c-45ba-a3ee-a9b7ca6e75d5\") " pod="openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8"
Apr 22 18:39:32.316207 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.316186    2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/197c7922-b84c-45ba-a3ee-a9b7ca6e75d5-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6f8f9b78fb-vdqt8\" (UID: \"197c7922-b84c-45ba-a3ee-a9b7ca6e75d5\") " pod="openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8"
Apr 22 18:39:32.316247 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.316218    2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/197c7922-b84c-45ba-a3ee-a9b7ca6e75d5-secret-grpc-tls\") pod \"thanos-querier-6f8f9b78fb-vdqt8\" (UID: \"197c7922-b84c-45ba-a3ee-a9b7ca6e75d5\") " pod="openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8"
Apr 22 18:39:32.316247 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.316186    2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/197c7922-b84c-45ba-a3ee-a9b7ca6e75d5-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6f8f9b78fb-vdqt8\" (UID: \"197c7922-b84c-45ba-a3ee-a9b7ca6e75d5\") " pod="openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8"
Apr 22 18:39:32.320937 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.320907    2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj2qz\" (UniqueName: \"kubernetes.io/projected/197c7922-b84c-45ba-a3ee-a9b7ca6e75d5-kube-api-access-kj2qz\") pod \"thanos-querier-6f8f9b78fb-vdqt8\" (UID: \"197c7922-b84c-45ba-a3ee-a9b7ca6e75d5\") " pod="openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8"
Apr 22 18:39:32.416580 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.416532    2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8"
Apr 22 18:39:32.541058 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.541022    2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8"]
Apr 22 18:39:32.545363 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:39:32.545338    2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod197c7922_b84c_45ba_a3ee_a9b7ca6e75d5.slice/crio-fde014d2d6b1616ab37491ff0977310919999ec643e6b0678121d55143ea398e WatchSource:0}: Error finding container fde014d2d6b1616ab37491ff0977310919999ec643e6b0678121d55143ea398e: Status 404 returned error can't find the container with id fde014d2d6b1616ab37491ff0977310919999ec643e6b0678121d55143ea398e
Apr 22 18:39:32.888634 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.888599    2578 generic.go:358] "Generic (PLEG): container finished" podID="94e10a47-4857-4cff-9fbb-fbf8f76c8ae6" containerID="4728dc060e7b1f4358c751a77c6f673d813266b809525740b875c47018a07f61" exitCode=0
Apr 22 18:39:32.889067 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.888683    2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-w87gw" event={"ID":"94e10a47-4857-4cff-9fbb-fbf8f76c8ae6","Type":"ContainerDied","Data":"4728dc060e7b1f4358c751a77c6f673d813266b809525740b875c47018a07f61"}
Apr 22 18:39:32.889870 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:32.889844    2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8" event={"ID":"197c7922-b84c-45ba-a3ee-a9b7ca6e75d5","Type":"ContainerStarted","Data":"fde014d2d6b1616ab37491ff0977310919999ec643e6b0678121d55143ea398e"}
Apr 22 18:39:33.523256 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:33.523214    2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy"
pod="openshift-console/console-5499c67d88-dvxcr" Apr 22 18:39:33.523443 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:33.523271 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5499c67d88-dvxcr" Apr 22 18:39:33.528398 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:33.528372 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5499c67d88-dvxcr" Apr 22 18:39:33.897739 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:33.897700 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-w87gw" event={"ID":"94e10a47-4857-4cff-9fbb-fbf8f76c8ae6","Type":"ContainerStarted","Data":"90c3ca1e2184f355cad7b3f85e127b1126bd1810726c7327f3d534655df280a8"} Apr 22 18:39:33.898367 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:33.897746 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-w87gw" event={"ID":"94e10a47-4857-4cff-9fbb-fbf8f76c8ae6","Type":"ContainerStarted","Data":"7549cca42cfc2fb07a27155e5b38a0ee2e7b26d8cd1b6b3c225997a690c29689"} Apr 22 18:39:33.902124 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:33.902098 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5499c67d88-dvxcr" Apr 22 18:39:33.918582 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:33.918527 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-w87gw" podStartSLOduration=3.202393455 podStartE2EDuration="3.91851001s" podCreationTimestamp="2026-04-22 18:39:30 +0000 UTC" firstStartedPulling="2026-04-22 18:39:31.105931904 +0000 UTC m=+100.096198513" lastFinishedPulling="2026-04-22 18:39:31.822048455 +0000 UTC m=+100.812315068" observedRunningTime="2026-04-22 18:39:33.917116932 +0000 UTC m=+102.907383563" watchObservedRunningTime="2026-04-22 18:39:33.91851001 +0000 UTC m=+102.908776640" Apr 22 
18:39:34.902617 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:34.902576 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8" event={"ID":"197c7922-b84c-45ba-a3ee-a9b7ca6e75d5","Type":"ContainerStarted","Data":"5c5abdf0c0861293a02c8ba918c9950c8a0fb0af76cb7cb400c6abf1a63dfcd1"} Apr 22 18:39:34.902617 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:34.902622 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8" event={"ID":"197c7922-b84c-45ba-a3ee-a9b7ca6e75d5","Type":"ContainerStarted","Data":"b634a7279effff06da55ca850cefc2f79ba05b423793220e18dd0daacb79e0e4"} Apr 22 18:39:34.903115 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:34.902637 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8" event={"ID":"197c7922-b84c-45ba-a3ee-a9b7ca6e75d5","Type":"ContainerStarted","Data":"a94d9a2366354d86fad55933171202c5011b77f07114cf4ca62fd1b2cb35f4a5"} Apr 22 18:39:34.944201 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:34.944166 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-777c754768-2vxw5" Apr 22 18:39:34.944357 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:34.944215 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-777c754768-2vxw5" Apr 22 18:39:34.949093 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:34.949064 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-777c754768-2vxw5" Apr 22 18:39:35.907535 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:35.907494 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8" 
event={"ID":"197c7922-b84c-45ba-a3ee-a9b7ca6e75d5","Type":"ContainerStarted","Data":"e9eaf2f6b0423f413add10c65b90a518b1bc0bd0ca338f5cbdbc047e70fda22c"} Apr 22 18:39:35.907976 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:35.907544 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8" event={"ID":"197c7922-b84c-45ba-a3ee-a9b7ca6e75d5","Type":"ContainerStarted","Data":"52e4fba87d69d818785fec0660cd12010026885bfd20cbfcece19f3c6acc4832"} Apr 22 18:39:35.907976 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:35.907558 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8" event={"ID":"197c7922-b84c-45ba-a3ee-a9b7ca6e75d5","Type":"ContainerStarted","Data":"092cc57244d2507e44063d67026664a12f7143d724763fc38956436e07ad37ce"} Apr 22 18:39:35.908090 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:35.907996 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8" Apr 22 18:39:35.911703 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:35.911682 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-777c754768-2vxw5" Apr 22 18:39:35.935524 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:35.935455 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8" podStartSLOduration=1.146512143 podStartE2EDuration="3.935442448s" podCreationTimestamp="2026-04-22 18:39:32 +0000 UTC" firstStartedPulling="2026-04-22 18:39:32.547030162 +0000 UTC m=+101.537296769" lastFinishedPulling="2026-04-22 18:39:35.335960467 +0000 UTC m=+104.326227074" observedRunningTime="2026-04-22 18:39:35.933769784 +0000 UTC m=+104.924036413" watchObservedRunningTime="2026-04-22 18:39:35.935442448 +0000 UTC m=+104.925709077" Apr 22 18:39:35.981377 ip-10-0-132-151 kubenswrapper[2578]: I0422 
18:39:35.981326 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5499c67d88-dvxcr"] Apr 22 18:39:41.258299 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:41.258261 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-685b7cdf4f-fdsjf"] Apr 22 18:39:41.261598 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:41.261576 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-685b7cdf4f-fdsjf" Apr 22 18:39:41.270195 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:41.270165 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-685b7cdf4f-fdsjf"] Apr 22 18:39:41.294607 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:41.294559 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-trusted-ca-bundle\") pod \"console-685b7cdf4f-fdsjf\" (UID: \"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e\") " pod="openshift-console/console-685b7cdf4f-fdsjf" Apr 22 18:39:41.294607 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:41.294610 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-console-oauth-config\") pod \"console-685b7cdf4f-fdsjf\" (UID: \"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e\") " pod="openshift-console/console-685b7cdf4f-fdsjf" Apr 22 18:39:41.294847 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:41.294653 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-oauth-serving-cert\") pod \"console-685b7cdf4f-fdsjf\" (UID: \"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e\") " 
pod="openshift-console/console-685b7cdf4f-fdsjf" Apr 22 18:39:41.294847 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:41.294683 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-console-serving-cert\") pod \"console-685b7cdf4f-fdsjf\" (UID: \"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e\") " pod="openshift-console/console-685b7cdf4f-fdsjf" Apr 22 18:39:41.294847 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:41.294768 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-console-config\") pod \"console-685b7cdf4f-fdsjf\" (UID: \"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e\") " pod="openshift-console/console-685b7cdf4f-fdsjf" Apr 22 18:39:41.294847 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:41.294806 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2bxl\" (UniqueName: \"kubernetes.io/projected/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-kube-api-access-d2bxl\") pod \"console-685b7cdf4f-fdsjf\" (UID: \"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e\") " pod="openshift-console/console-685b7cdf4f-fdsjf" Apr 22 18:39:41.294847 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:41.294834 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-service-ca\") pod \"console-685b7cdf4f-fdsjf\" (UID: \"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e\") " pod="openshift-console/console-685b7cdf4f-fdsjf" Apr 22 18:39:41.395782 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:41.395740 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-service-ca\") pod \"console-685b7cdf4f-fdsjf\" (UID: \"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e\") " pod="openshift-console/console-685b7cdf4f-fdsjf" Apr 22 18:39:41.395966 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:41.395817 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-trusted-ca-bundle\") pod \"console-685b7cdf4f-fdsjf\" (UID: \"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e\") " pod="openshift-console/console-685b7cdf4f-fdsjf" Apr 22 18:39:41.395966 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:41.395854 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-console-oauth-config\") pod \"console-685b7cdf4f-fdsjf\" (UID: \"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e\") " pod="openshift-console/console-685b7cdf4f-fdsjf" Apr 22 18:39:41.395966 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:41.395871 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-oauth-serving-cert\") pod \"console-685b7cdf4f-fdsjf\" (UID: \"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e\") " pod="openshift-console/console-685b7cdf4f-fdsjf" Apr 22 18:39:41.395966 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:41.395893 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-console-serving-cert\") pod \"console-685b7cdf4f-fdsjf\" (UID: \"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e\") " pod="openshift-console/console-685b7cdf4f-fdsjf" Apr 22 18:39:41.395966 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:41.395932 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-console-config\") pod \"console-685b7cdf4f-fdsjf\" (UID: \"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e\") " pod="openshift-console/console-685b7cdf4f-fdsjf" Apr 22 18:39:41.396228 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:41.395955 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d2bxl\" (UniqueName: \"kubernetes.io/projected/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-kube-api-access-d2bxl\") pod \"console-685b7cdf4f-fdsjf\" (UID: \"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e\") " pod="openshift-console/console-685b7cdf4f-fdsjf" Apr 22 18:39:41.396632 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:41.396606 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-service-ca\") pod \"console-685b7cdf4f-fdsjf\" (UID: \"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e\") " pod="openshift-console/console-685b7cdf4f-fdsjf" Apr 22 18:39:41.396826 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:41.396709 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-oauth-serving-cert\") pod \"console-685b7cdf4f-fdsjf\" (UID: \"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e\") " pod="openshift-console/console-685b7cdf4f-fdsjf" Apr 22 18:39:41.396826 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:41.396715 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-console-config\") pod \"console-685b7cdf4f-fdsjf\" (UID: \"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e\") " pod="openshift-console/console-685b7cdf4f-fdsjf" Apr 22 18:39:41.396956 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:41.396905 
2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-trusted-ca-bundle\") pod \"console-685b7cdf4f-fdsjf\" (UID: \"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e\") " pod="openshift-console/console-685b7cdf4f-fdsjf" Apr 22 18:39:41.398358 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:41.398334 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-console-serving-cert\") pod \"console-685b7cdf4f-fdsjf\" (UID: \"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e\") " pod="openshift-console/console-685b7cdf4f-fdsjf" Apr 22 18:39:41.398452 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:41.398340 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-console-oauth-config\") pod \"console-685b7cdf4f-fdsjf\" (UID: \"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e\") " pod="openshift-console/console-685b7cdf4f-fdsjf" Apr 22 18:39:41.407300 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:41.407271 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2bxl\" (UniqueName: \"kubernetes.io/projected/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-kube-api-access-d2bxl\") pod \"console-685b7cdf4f-fdsjf\" (UID: \"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e\") " pod="openshift-console/console-685b7cdf4f-fdsjf" Apr 22 18:39:41.571215 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:41.571126 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-685b7cdf4f-fdsjf" Apr 22 18:39:41.695536 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:41.695513 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-685b7cdf4f-fdsjf"] Apr 22 18:39:41.697964 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:39:41.697934 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfd80a83_e929_4ac8_ab34_dea0dd64ef2e.slice/crio-26528e2f03f9551afb844e516b9e9c8c690b7ae3e643213fc1f8dcca861af7f4 WatchSource:0}: Error finding container 26528e2f03f9551afb844e516b9e9c8c690b7ae3e643213fc1f8dcca861af7f4: Status 404 returned error can't find the container with id 26528e2f03f9551afb844e516b9e9c8c690b7ae3e643213fc1f8dcca861af7f4 Apr 22 18:39:41.720936 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:41.720891 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv" podUID="ef9d655a-2703-42fd-97e9-84bb1fe30e68" containerName="registry" containerID="cri-o://658b809e489a65bdf87557d6c2552d122b60f3a276961d414adf4d111a9f02b1" gracePeriod=30 Apr 22 18:39:41.892434 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:41.892407 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-sv8gg" Apr 22 18:39:41.916648 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:41.916615 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-6f8f9b78fb-vdqt8" Apr 22 18:39:41.927141 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:41.927107 2578 generic.go:358] "Generic (PLEG): container finished" podID="ef9d655a-2703-42fd-97e9-84bb1fe30e68" containerID="658b809e489a65bdf87557d6c2552d122b60f3a276961d414adf4d111a9f02b1" exitCode=0 Apr 22 18:39:41.927316 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:41.927178 2578 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv" event={"ID":"ef9d655a-2703-42fd-97e9-84bb1fe30e68","Type":"ContainerDied","Data":"658b809e489a65bdf87557d6c2552d122b60f3a276961d414adf4d111a9f02b1"} Apr 22 18:39:41.928729 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:41.928702 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-685b7cdf4f-fdsjf" event={"ID":"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e","Type":"ContainerStarted","Data":"ce362c993999a5ca250745e741af5489df76a29c364d74ed94da7e03b55fa41a"} Apr 22 18:39:41.928861 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:41.928735 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-685b7cdf4f-fdsjf" event={"ID":"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e","Type":"ContainerStarted","Data":"26528e2f03f9551afb844e516b9e9c8c690b7ae3e643213fc1f8dcca861af7f4"} Apr 22 18:39:41.951259 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:41.951235 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv" Apr 22 18:39:42.000530 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:42.000490 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef9d655a-2703-42fd-97e9-84bb1fe30e68-trusted-ca\") pod \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\" (UID: \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " Apr 22 18:39:42.000718 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:42.000551 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ef9d655a-2703-42fd-97e9-84bb1fe30e68-registry-tls\") pod \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\" (UID: \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " Apr 22 18:39:42.000718 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:42.000644 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6fb6\" (UniqueName: \"kubernetes.io/projected/ef9d655a-2703-42fd-97e9-84bb1fe30e68-kube-api-access-q6fb6\") pod \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\" (UID: \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " Apr 22 18:39:42.000832 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:42.000734 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ef9d655a-2703-42fd-97e9-84bb1fe30e68-registry-certificates\") pod \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\" (UID: \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " Apr 22 18:39:42.000988 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:42.000959 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef9d655a-2703-42fd-97e9-84bb1fe30e68-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "ef9d655a-2703-42fd-97e9-84bb1fe30e68" (UID: "ef9d655a-2703-42fd-97e9-84bb1fe30e68"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:39:42.001051 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:42.001018 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ef9d655a-2703-42fd-97e9-84bb1fe30e68-bound-sa-token\") pod \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\" (UID: \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " Apr 22 18:39:42.001227 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:42.001195 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef9d655a-2703-42fd-97e9-84bb1fe30e68-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "ef9d655a-2703-42fd-97e9-84bb1fe30e68" (UID: "ef9d655a-2703-42fd-97e9-84bb1fe30e68"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:39:42.001401 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:42.001384 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ef9d655a-2703-42fd-97e9-84bb1fe30e68-ca-trust-extracted\") pod \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\" (UID: \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " Apr 22 18:39:42.001487 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:42.001438 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ef9d655a-2703-42fd-97e9-84bb1fe30e68-installation-pull-secrets\") pod \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\" (UID: \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " Apr 22 18:39:42.001808 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:42.001787 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/ef9d655a-2703-42fd-97e9-84bb1fe30e68-image-registry-private-configuration\") pod \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\" (UID: \"ef9d655a-2703-42fd-97e9-84bb1fe30e68\") " Apr 22 18:39:42.002400 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:42.002376 2578 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ef9d655a-2703-42fd-97e9-84bb1fe30e68-registry-certificates\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:39:42.002581 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:42.002409 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef9d655a-2703-42fd-97e9-84bb1fe30e68-trusted-ca\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:39:42.006911 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:42.006875 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef9d655a-2703-42fd-97e9-84bb1fe30e68-kube-api-access-q6fb6" (OuterVolumeSpecName: "kube-api-access-q6fb6") pod "ef9d655a-2703-42fd-97e9-84bb1fe30e68" (UID: "ef9d655a-2703-42fd-97e9-84bb1fe30e68"). InnerVolumeSpecName "kube-api-access-q6fb6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:39:42.007118 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:42.007077 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef9d655a-2703-42fd-97e9-84bb1fe30e68-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "ef9d655a-2703-42fd-97e9-84bb1fe30e68" (UID: "ef9d655a-2703-42fd-97e9-84bb1fe30e68"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:39:42.007214 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:42.007169 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef9d655a-2703-42fd-97e9-84bb1fe30e68-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "ef9d655a-2703-42fd-97e9-84bb1fe30e68" (UID: "ef9d655a-2703-42fd-97e9-84bb1fe30e68"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:39:42.007214 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:42.007183 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef9d655a-2703-42fd-97e9-84bb1fe30e68-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "ef9d655a-2703-42fd-97e9-84bb1fe30e68" (UID: "ef9d655a-2703-42fd-97e9-84bb1fe30e68"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:39:42.007448 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:42.007421 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef9d655a-2703-42fd-97e9-84bb1fe30e68-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "ef9d655a-2703-42fd-97e9-84bb1fe30e68" (UID: "ef9d655a-2703-42fd-97e9-84bb1fe30e68"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:39:42.012996 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:42.012964 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef9d655a-2703-42fd-97e9-84bb1fe30e68-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "ef9d655a-2703-42fd-97e9-84bb1fe30e68" (UID: "ef9d655a-2703-42fd-97e9-84bb1fe30e68"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:39:42.013113 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:42.013018 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-685b7cdf4f-fdsjf" podStartSLOduration=1.012997238 podStartE2EDuration="1.012997238s" podCreationTimestamp="2026-04-22 18:39:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:39:42.010798618 +0000 UTC m=+111.001065250" watchObservedRunningTime="2026-04-22 18:39:42.012997238 +0000 UTC m=+111.003263868"
Apr 22 18:39:42.103615 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:42.103528 2578 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ef9d655a-2703-42fd-97e9-84bb1fe30e68-bound-sa-token\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:39:42.103615 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:42.103560 2578 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ef9d655a-2703-42fd-97e9-84bb1fe30e68-ca-trust-extracted\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:39:42.103615 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:42.103572 2578 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ef9d655a-2703-42fd-97e9-84bb1fe30e68-installation-pull-secrets\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:39:42.103615 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:42.103584 2578 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ef9d655a-2703-42fd-97e9-84bb1fe30e68-image-registry-private-configuration\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:39:42.103615 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:42.103595 2578 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ef9d655a-2703-42fd-97e9-84bb1fe30e68-registry-tls\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:39:42.103615 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:42.103604 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q6fb6\" (UniqueName: \"kubernetes.io/projected/ef9d655a-2703-42fd-97e9-84bb1fe30e68-kube-api-access-q6fb6\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:39:42.933003 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:42.932971 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv" event={"ID":"ef9d655a-2703-42fd-97e9-84bb1fe30e68","Type":"ContainerDied","Data":"554630f40092bb6d61e67c67b301bd3f187b11eaf467ebe235eaf348092409f1"}
Apr 22 18:39:42.933003 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:42.932987 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-9d4498d6c-bkgtv"
Apr 22 18:39:42.933003 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:42.933011 2578 scope.go:117] "RemoveContainer" containerID="658b809e489a65bdf87557d6c2552d122b60f3a276961d414adf4d111a9f02b1"
Apr 22 18:39:42.957013 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:42.956980 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-9d4498d6c-bkgtv"]
Apr 22 18:39:42.961154 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:42.961126 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-9d4498d6c-bkgtv"]
Apr 22 18:39:43.538396 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:43.538362 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef9d655a-2703-42fd-97e9-84bb1fe30e68" path="/var/lib/kubelet/pods/ef9d655a-2703-42fd-97e9-84bb1fe30e68/volumes"
Apr 22 18:39:51.571454 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:51.571409 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-685b7cdf4f-fdsjf"
Apr 22 18:39:51.571454 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:51.571480 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-685b7cdf4f-fdsjf"
Apr 22 18:39:51.576019 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:51.575990 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-685b7cdf4f-fdsjf"
Apr 22 18:39:51.961279 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:51.961250 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-685b7cdf4f-fdsjf"
Apr 22 18:39:52.007129 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:39:52.007093 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-777c754768-2vxw5"]
Apr 22 18:40:01.001208 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:01.001149 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5499c67d88-dvxcr" podUID="d05fdc27-f402-4383-be60-02c7fb718f9a" containerName="console" containerID="cri-o://bf411eac6be2c5b04293cb6f51017e61c96d6ddfe98198c409f517df43795fb5" gracePeriod=15
Apr 22 18:40:01.234049 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:01.234030 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5499c67d88-dvxcr_d05fdc27-f402-4383-be60-02c7fb718f9a/console/0.log"
Apr 22 18:40:01.234159 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:01.234088 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5499c67d88-dvxcr"
Apr 22 18:40:01.268380 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:01.268296 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f-metrics-certs\") pod \"network-metrics-daemon-bkhdr\" (UID: \"9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f\") " pod="openshift-multus/network-metrics-daemon-bkhdr"
Apr 22 18:40:01.270513 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:01.270492 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f-metrics-certs\") pod \"network-metrics-daemon-bkhdr\" (UID: \"9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f\") " pod="openshift-multus/network-metrics-daemon-bkhdr"
Apr 22 18:40:01.369217 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:01.369183 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d05fdc27-f402-4383-be60-02c7fb718f9a-oauth-serving-cert\") pod \"d05fdc27-f402-4383-be60-02c7fb718f9a\" (UID: \"d05fdc27-f402-4383-be60-02c7fb718f9a\") "
Apr 22 18:40:01.369378 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:01.369259 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d05fdc27-f402-4383-be60-02c7fb718f9a-console-config\") pod \"d05fdc27-f402-4383-be60-02c7fb718f9a\" (UID: \"d05fdc27-f402-4383-be60-02c7fb718f9a\") "
Apr 22 18:40:01.369425 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:01.369360 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d05fdc27-f402-4383-be60-02c7fb718f9a-service-ca\") pod \"d05fdc27-f402-4383-be60-02c7fb718f9a\" (UID: \"d05fdc27-f402-4383-be60-02c7fb718f9a\") "
Apr 22 18:40:01.369425 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:01.369412 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d05fdc27-f402-4383-be60-02c7fb718f9a-console-serving-cert\") pod \"d05fdc27-f402-4383-be60-02c7fb718f9a\" (UID: \"d05fdc27-f402-4383-be60-02c7fb718f9a\") "
Apr 22 18:40:01.369519 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:01.369449 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d05fdc27-f402-4383-be60-02c7fb718f9a-console-oauth-config\") pod \"d05fdc27-f402-4383-be60-02c7fb718f9a\" (UID: \"d05fdc27-f402-4383-be60-02c7fb718f9a\") "
Apr 22 18:40:01.369519 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:01.369494 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bm7hx\" (UniqueName: \"kubernetes.io/projected/d05fdc27-f402-4383-be60-02c7fb718f9a-kube-api-access-bm7hx\") pod \"d05fdc27-f402-4383-be60-02c7fb718f9a\" (UID: \"d05fdc27-f402-4383-be60-02c7fb718f9a\") "
Apr 22 18:40:01.369703 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:01.369676 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d05fdc27-f402-4383-be60-02c7fb718f9a-console-config" (OuterVolumeSpecName: "console-config") pod "d05fdc27-f402-4383-be60-02c7fb718f9a" (UID: "d05fdc27-f402-4383-be60-02c7fb718f9a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:40:01.369776 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:01.369703 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d05fdc27-f402-4383-be60-02c7fb718f9a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d05fdc27-f402-4383-be60-02c7fb718f9a" (UID: "d05fdc27-f402-4383-be60-02c7fb718f9a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:40:01.369820 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:01.369781 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d05fdc27-f402-4383-be60-02c7fb718f9a-service-ca" (OuterVolumeSpecName: "service-ca") pod "d05fdc27-f402-4383-be60-02c7fb718f9a" (UID: "d05fdc27-f402-4383-be60-02c7fb718f9a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:40:01.371681 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:01.371645 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d05fdc27-f402-4383-be60-02c7fb718f9a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d05fdc27-f402-4383-be60-02c7fb718f9a" (UID: "d05fdc27-f402-4383-be60-02c7fb718f9a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:40:01.371783 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:01.371697 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d05fdc27-f402-4383-be60-02c7fb718f9a-kube-api-access-bm7hx" (OuterVolumeSpecName: "kube-api-access-bm7hx") pod "d05fdc27-f402-4383-be60-02c7fb718f9a" (UID: "d05fdc27-f402-4383-be60-02c7fb718f9a"). InnerVolumeSpecName "kube-api-access-bm7hx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:40:01.371783 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:01.371742 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d05fdc27-f402-4383-be60-02c7fb718f9a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d05fdc27-f402-4383-be60-02c7fb718f9a" (UID: "d05fdc27-f402-4383-be60-02c7fb718f9a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:40:01.448217 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:01.448180 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-stzzq\""
Apr 22 18:40:01.456172 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:01.456151 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bkhdr"
Apr 22 18:40:01.470270 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:01.470243 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bm7hx\" (UniqueName: \"kubernetes.io/projected/d05fdc27-f402-4383-be60-02c7fb718f9a-kube-api-access-bm7hx\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:40:01.470270 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:01.470271 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d05fdc27-f402-4383-be60-02c7fb718f9a-oauth-serving-cert\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:40:01.470405 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:01.470281 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d05fdc27-f402-4383-be60-02c7fb718f9a-console-config\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:40:01.470405 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:01.470291 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d05fdc27-f402-4383-be60-02c7fb718f9a-service-ca\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:40:01.470405 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:01.470299 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d05fdc27-f402-4383-be60-02c7fb718f9a-console-serving-cert\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:40:01.470405 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:01.470307 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d05fdc27-f402-4383-be60-02c7fb718f9a-console-oauth-config\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:40:01.574966 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:01.574832 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bkhdr"]
Apr 22 18:40:01.984155 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:01.984125 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5499c67d88-dvxcr_d05fdc27-f402-4383-be60-02c7fb718f9a/console/0.log"
Apr 22 18:40:01.984332 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:01.984169 2578 generic.go:358] "Generic (PLEG): container finished" podID="d05fdc27-f402-4383-be60-02c7fb718f9a" containerID="bf411eac6be2c5b04293cb6f51017e61c96d6ddfe98198c409f517df43795fb5" exitCode=2
Apr 22 18:40:01.984332 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:01.984256 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5499c67d88-dvxcr" event={"ID":"d05fdc27-f402-4383-be60-02c7fb718f9a","Type":"ContainerDied","Data":"bf411eac6be2c5b04293cb6f51017e61c96d6ddfe98198c409f517df43795fb5"}
Apr 22 18:40:01.984332 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:01.984281 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5499c67d88-dvxcr"
Apr 22 18:40:01.984332 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:01.984291 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5499c67d88-dvxcr" event={"ID":"d05fdc27-f402-4383-be60-02c7fb718f9a","Type":"ContainerDied","Data":"bac2b85064c9bca91ce4bec78c2dd8e810abee8fa90e033b447e72b0a4e757f1"}
Apr 22 18:40:01.984332 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:01.984305 2578 scope.go:117] "RemoveContainer" containerID="bf411eac6be2c5b04293cb6f51017e61c96d6ddfe98198c409f517df43795fb5"
Apr 22 18:40:01.985504 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:01.985456 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bkhdr" event={"ID":"9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f","Type":"ContainerStarted","Data":"78ebd0dd057837e04eef11689ae0bacc0dc6852cc8ba0a351725a9d778dd33b9"}
Apr 22 18:40:01.991943 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:01.991925 2578 scope.go:117] "RemoveContainer" containerID="bf411eac6be2c5b04293cb6f51017e61c96d6ddfe98198c409f517df43795fb5"
Apr 22 18:40:01.992239 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:40:01.992210 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf411eac6be2c5b04293cb6f51017e61c96d6ddfe98198c409f517df43795fb5\": container with ID starting with bf411eac6be2c5b04293cb6f51017e61c96d6ddfe98198c409f517df43795fb5 not found: ID does not exist" containerID="bf411eac6be2c5b04293cb6f51017e61c96d6ddfe98198c409f517df43795fb5"
Apr 22 18:40:01.992288 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:01.992247 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf411eac6be2c5b04293cb6f51017e61c96d6ddfe98198c409f517df43795fb5"} err="failed to get container status \"bf411eac6be2c5b04293cb6f51017e61c96d6ddfe98198c409f517df43795fb5\": rpc error: code = NotFound desc = could not find container \"bf411eac6be2c5b04293cb6f51017e61c96d6ddfe98198c409f517df43795fb5\": container with ID starting with bf411eac6be2c5b04293cb6f51017e61c96d6ddfe98198c409f517df43795fb5 not found: ID does not exist"
Apr 22 18:40:02.001293 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:02.001265 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5499c67d88-dvxcr"]
Apr 22 18:40:02.005792 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:02.005768 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5499c67d88-dvxcr"]
Apr 22 18:40:02.991253 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:02.991190 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bkhdr" event={"ID":"9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f","Type":"ContainerStarted","Data":"1e37887bc0b9ceca1c47607b2043972e450d4141ee60ed143a06114be92550e4"}
Apr 22 18:40:03.539312 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:03.539271 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d05fdc27-f402-4383-be60-02c7fb718f9a" path="/var/lib/kubelet/pods/d05fdc27-f402-4383-be60-02c7fb718f9a/volumes"
Apr 22 18:40:03.995823 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:03.995785 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bkhdr" event={"ID":"9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f","Type":"ContainerStarted","Data":"5c775a2ad2e30e91044a6c6500464f800a6d45de53192f983acd0f31c98e3ee4"}
Apr 22 18:40:04.014263 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:04.014213 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-bkhdr" podStartSLOduration=131.761255363 podStartE2EDuration="2m13.014197354s" podCreationTimestamp="2026-04-22 18:37:51 +0000 UTC" firstStartedPulling="2026-04-22 18:40:01.5807268 +0000 UTC m=+130.570993408" lastFinishedPulling="2026-04-22 18:40:02.833668785 +0000 UTC m=+131.823935399" observedRunningTime="2026-04-22 18:40:04.012763773 +0000 UTC m=+133.003030423" watchObservedRunningTime="2026-04-22 18:40:04.014197354 +0000 UTC m=+133.004463984"
Apr 22 18:40:17.026663 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:17.026591 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-777c754768-2vxw5" podUID="cf80386c-cac4-45f7-85c0-e71274afcf40" containerName="console" containerID="cri-o://298d2ba668adf741d1fffe93bb653c19c6ae9123dd85455f3bd34eb9fa42628a" gracePeriod=15
Apr 22 18:40:17.253366 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:17.253345 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-777c754768-2vxw5_cf80386c-cac4-45f7-85c0-e71274afcf40/console/0.log"
Apr 22 18:40:17.253500 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:17.253405 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-777c754768-2vxw5"
Apr 22 18:40:17.391939 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:17.391908 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf80386c-cac4-45f7-85c0-e71274afcf40-trusted-ca-bundle\") pod \"cf80386c-cac4-45f7-85c0-e71274afcf40\" (UID: \"cf80386c-cac4-45f7-85c0-e71274afcf40\") "
Apr 22 18:40:17.392134 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:17.391979 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2qll\" (UniqueName: \"kubernetes.io/projected/cf80386c-cac4-45f7-85c0-e71274afcf40-kube-api-access-v2qll\") pod \"cf80386c-cac4-45f7-85c0-e71274afcf40\" (UID: \"cf80386c-cac4-45f7-85c0-e71274afcf40\") "
Apr 22 18:40:17.392134 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:17.392006 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cf80386c-cac4-45f7-85c0-e71274afcf40-oauth-serving-cert\") pod \"cf80386c-cac4-45f7-85c0-e71274afcf40\" (UID: \"cf80386c-cac4-45f7-85c0-e71274afcf40\") "
Apr 22 18:40:17.392134 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:17.392058 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cf80386c-cac4-45f7-85c0-e71274afcf40-console-oauth-config\") pod \"cf80386c-cac4-45f7-85c0-e71274afcf40\" (UID: \"cf80386c-cac4-45f7-85c0-e71274afcf40\") "
Apr 22 18:40:17.392134 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:17.392101 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cf80386c-cac4-45f7-85c0-e71274afcf40-service-ca\") pod \"cf80386c-cac4-45f7-85c0-e71274afcf40\" (UID: \"cf80386c-cac4-45f7-85c0-e71274afcf40\") "
Apr 22 18:40:17.392134 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:17.392126 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf80386c-cac4-45f7-85c0-e71274afcf40-console-serving-cert\") pod \"cf80386c-cac4-45f7-85c0-e71274afcf40\" (UID: \"cf80386c-cac4-45f7-85c0-e71274afcf40\") "
Apr 22 18:40:17.392376 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:17.392155 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cf80386c-cac4-45f7-85c0-e71274afcf40-console-config\") pod \"cf80386c-cac4-45f7-85c0-e71274afcf40\" (UID: \"cf80386c-cac4-45f7-85c0-e71274afcf40\") "
Apr 22 18:40:17.392451 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:17.392416 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf80386c-cac4-45f7-85c0-e71274afcf40-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "cf80386c-cac4-45f7-85c0-e71274afcf40" (UID: "cf80386c-cac4-45f7-85c0-e71274afcf40"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:40:17.392547 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:17.392441 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf80386c-cac4-45f7-85c0-e71274afcf40-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "cf80386c-cac4-45f7-85c0-e71274afcf40" (UID: "cf80386c-cac4-45f7-85c0-e71274afcf40"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:40:17.392625 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:17.392597 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf80386c-cac4-45f7-85c0-e71274afcf40-service-ca" (OuterVolumeSpecName: "service-ca") pod "cf80386c-cac4-45f7-85c0-e71274afcf40" (UID: "cf80386c-cac4-45f7-85c0-e71274afcf40"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:40:17.392759 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:17.392663 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf80386c-cac4-45f7-85c0-e71274afcf40-console-config" (OuterVolumeSpecName: "console-config") pod "cf80386c-cac4-45f7-85c0-e71274afcf40" (UID: "cf80386c-cac4-45f7-85c0-e71274afcf40"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:40:17.394312 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:17.394279 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf80386c-cac4-45f7-85c0-e71274afcf40-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "cf80386c-cac4-45f7-85c0-e71274afcf40" (UID: "cf80386c-cac4-45f7-85c0-e71274afcf40"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:40:17.394631 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:17.394607 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf80386c-cac4-45f7-85c0-e71274afcf40-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "cf80386c-cac4-45f7-85c0-e71274afcf40" (UID: "cf80386c-cac4-45f7-85c0-e71274afcf40"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:40:17.394707 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:17.394637 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf80386c-cac4-45f7-85c0-e71274afcf40-kube-api-access-v2qll" (OuterVolumeSpecName: "kube-api-access-v2qll") pod "cf80386c-cac4-45f7-85c0-e71274afcf40" (UID: "cf80386c-cac4-45f7-85c0-e71274afcf40"). InnerVolumeSpecName "kube-api-access-v2qll". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:40:17.492635 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:17.492603 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cf80386c-cac4-45f7-85c0-e71274afcf40-console-oauth-config\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:40:17.492635 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:17.492634 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cf80386c-cac4-45f7-85c0-e71274afcf40-service-ca\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:40:17.492816 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:17.492648 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf80386c-cac4-45f7-85c0-e71274afcf40-console-serving-cert\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:40:17.492816 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:17.492663 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cf80386c-cac4-45f7-85c0-e71274afcf40-console-config\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:40:17.492816 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:17.492675 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf80386c-cac4-45f7-85c0-e71274afcf40-trusted-ca-bundle\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:40:17.492816 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:17.492690 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v2qll\" (UniqueName: \"kubernetes.io/projected/cf80386c-cac4-45f7-85c0-e71274afcf40-kube-api-access-v2qll\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:40:17.492816 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:17.492705 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cf80386c-cac4-45f7-85c0-e71274afcf40-oauth-serving-cert\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:40:18.041722 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:18.041695 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-777c754768-2vxw5_cf80386c-cac4-45f7-85c0-e71274afcf40/console/0.log"
Apr 22 18:40:18.042172 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:18.041734 2578 generic.go:358] "Generic (PLEG): container finished" podID="cf80386c-cac4-45f7-85c0-e71274afcf40" containerID="298d2ba668adf741d1fffe93bb653c19c6ae9123dd85455f3bd34eb9fa42628a" exitCode=2
Apr 22 18:40:18.042172 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:18.041768 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-777c754768-2vxw5" event={"ID":"cf80386c-cac4-45f7-85c0-e71274afcf40","Type":"ContainerDied","Data":"298d2ba668adf741d1fffe93bb653c19c6ae9123dd85455f3bd34eb9fa42628a"}
Apr 22 18:40:18.042172 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:18.041820 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-777c754768-2vxw5" event={"ID":"cf80386c-cac4-45f7-85c0-e71274afcf40","Type":"ContainerDied","Data":"c86ceb3320dd4d1e6db14a851d9f3701904d70d7b2687ea5826dc7255ee4332b"}
Apr 22 18:40:18.042172 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:18.041823 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-777c754768-2vxw5"
Apr 22 18:40:18.042172 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:18.041841 2578 scope.go:117] "RemoveContainer" containerID="298d2ba668adf741d1fffe93bb653c19c6ae9123dd85455f3bd34eb9fa42628a"
Apr 22 18:40:18.050792 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:18.050746 2578 scope.go:117] "RemoveContainer" containerID="298d2ba668adf741d1fffe93bb653c19c6ae9123dd85455f3bd34eb9fa42628a"
Apr 22 18:40:18.051267 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:40:18.051242 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"298d2ba668adf741d1fffe93bb653c19c6ae9123dd85455f3bd34eb9fa42628a\": container with ID starting with 298d2ba668adf741d1fffe93bb653c19c6ae9123dd85455f3bd34eb9fa42628a not found: ID does not exist" containerID="298d2ba668adf741d1fffe93bb653c19c6ae9123dd85455f3bd34eb9fa42628a"
Apr 22 18:40:18.051379 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:18.051270 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"298d2ba668adf741d1fffe93bb653c19c6ae9123dd85455f3bd34eb9fa42628a"} err="failed to get container status \"298d2ba668adf741d1fffe93bb653c19c6ae9123dd85455f3bd34eb9fa42628a\": rpc error: code = NotFound desc = could not find container \"298d2ba668adf741d1fffe93bb653c19c6ae9123dd85455f3bd34eb9fa42628a\": container with ID starting with 298d2ba668adf741d1fffe93bb653c19c6ae9123dd85455f3bd34eb9fa42628a not found: ID does not exist"
Apr 22 18:40:18.060938 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:18.060913 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-777c754768-2vxw5"]
Apr 22 18:40:18.064642 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:18.064619 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-777c754768-2vxw5"]
Apr 22 18:40:19.538225 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:19.538195 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf80386c-cac4-45f7-85c0-e71274afcf40" path="/var/lib/kubelet/pods/cf80386c-cac4-45f7-85c0-e71274afcf40/volumes"
Apr 22 18:40:48.871010 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:48.870923 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5fdbf9894c-n795s"]
Apr 22 18:40:48.871450 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:48.871204 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef9d655a-2703-42fd-97e9-84bb1fe30e68" containerName="registry"
Apr 22 18:40:48.871450 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:48.871214 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef9d655a-2703-42fd-97e9-84bb1fe30e68" containerName="registry"
Apr 22 18:40:48.871450 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:48.871231 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cf80386c-cac4-45f7-85c0-e71274afcf40" containerName="console"
Apr 22 18:40:48.871450 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:48.871236 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf80386c-cac4-45f7-85c0-e71274afcf40" containerName="console"
Apr 22 18:40:48.871450 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:48.871244 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d05fdc27-f402-4383-be60-02c7fb718f9a" containerName="console"
Apr 22 18:40:48.871450 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:48.871249 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d05fdc27-f402-4383-be60-02c7fb718f9a" containerName="console"
Apr 22 18:40:48.871450 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:48.871289 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d05fdc27-f402-4383-be60-02c7fb718f9a" containerName="console"
Apr 22 18:40:48.871450 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:48.871299 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ef9d655a-2703-42fd-97e9-84bb1fe30e68" containerName="registry"
Apr 22 18:40:48.871450 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:48.871306 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="cf80386c-cac4-45f7-85c0-e71274afcf40" containerName="console"
Apr 22 18:40:48.874161 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:48.874142 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5fdbf9894c-n795s"
Apr 22 18:40:48.884224 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:48.884196 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5fdbf9894c-n795s"]
Apr 22 18:40:48.915126 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:48.915088 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa104b99-ce27-4823-9997-a0cac82f7b63-oauth-serving-cert\") pod \"console-5fdbf9894c-n795s\" (UID: \"aa104b99-ce27-4823-9997-a0cac82f7b63\") " pod="openshift-console/console-5fdbf9894c-n795s"
Apr 22 18:40:48.915126 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:48.915127 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa104b99-ce27-4823-9997-a0cac82f7b63-service-ca\") pod \"console-5fdbf9894c-n795s\" (UID: \"aa104b99-ce27-4823-9997-a0cac82f7b63\") " pod="openshift-console/console-5fdbf9894c-n795s"
Apr 22 18:40:48.915360 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:48.915146 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78p5t\" (UniqueName: \"kubernetes.io/projected/aa104b99-ce27-4823-9997-a0cac82f7b63-kube-api-access-78p5t\") pod \"console-5fdbf9894c-n795s\" (UID: \"aa104b99-ce27-4823-9997-a0cac82f7b63\") " pod="openshift-console/console-5fdbf9894c-n795s"
Apr 22 18:40:48.915360 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:48.915259 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa104b99-ce27-4823-9997-a0cac82f7b63-console-oauth-config\") pod \"console-5fdbf9894c-n795s\" (UID: \"aa104b99-ce27-4823-9997-a0cac82f7b63\") " pod="openshift-console/console-5fdbf9894c-n795s"
Apr 22 18:40:48.915360 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:48.915295 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa104b99-ce27-4823-9997-a0cac82f7b63-console-serving-cert\") pod \"console-5fdbf9894c-n795s\" (UID: \"aa104b99-ce27-4823-9997-a0cac82f7b63\") " pod="openshift-console/console-5fdbf9894c-n795s"
Apr 22 18:40:48.915360 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:48.915320 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa104b99-ce27-4823-9997-a0cac82f7b63-trusted-ca-bundle\") pod \"console-5fdbf9894c-n795s\" (UID: \"aa104b99-ce27-4823-9997-a0cac82f7b63\") " pod="openshift-console/console-5fdbf9894c-n795s"
Apr 22 18:40:48.915564 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:48.915386 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa104b99-ce27-4823-9997-a0cac82f7b63-console-config\") pod \"console-5fdbf9894c-n795s\" (UID: \"aa104b99-ce27-4823-9997-a0cac82f7b63\") " pod="openshift-console/console-5fdbf9894c-n795s"
Apr 22 18:40:49.016260 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:49.016218 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa104b99-ce27-4823-9997-a0cac82f7b63-oauth-serving-cert\") pod \"console-5fdbf9894c-n795s\" (UID: \"aa104b99-ce27-4823-9997-a0cac82f7b63\") " pod="openshift-console/console-5fdbf9894c-n795s"
Apr 22 18:40:49.016260 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:49.016259 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa104b99-ce27-4823-9997-a0cac82f7b63-service-ca\") pod \"console-5fdbf9894c-n795s\" (UID: \"aa104b99-ce27-4823-9997-a0cac82f7b63\") " pod="openshift-console/console-5fdbf9894c-n795s"
Apr 22 18:40:49.016575 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:49.016274 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78p5t\" (UniqueName: \"kubernetes.io/projected/aa104b99-ce27-4823-9997-a0cac82f7b63-kube-api-access-78p5t\") pod \"console-5fdbf9894c-n795s\" (UID: \"aa104b99-ce27-4823-9997-a0cac82f7b63\") " pod="openshift-console/console-5fdbf9894c-n795s"
Apr 22 18:40:49.016575 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:49.016349 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa104b99-ce27-4823-9997-a0cac82f7b63-console-oauth-config\") pod \"console-5fdbf9894c-n795s\" (UID: \"aa104b99-ce27-4823-9997-a0cac82f7b63\") " pod="openshift-console/console-5fdbf9894c-n795s"
Apr 22 18:40:49.016575 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:49.016400 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa104b99-ce27-4823-9997-a0cac82f7b63-console-serving-cert\") pod \"console-5fdbf9894c-n795s\" (UID: \"aa104b99-ce27-4823-9997-a0cac82f7b63\") " pod="openshift-console/console-5fdbf9894c-n795s"
Apr 22 18:40:49.016575 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:49.016446 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa104b99-ce27-4823-9997-a0cac82f7b63-trusted-ca-bundle\") pod \"console-5fdbf9894c-n795s\" (UID: \"aa104b99-ce27-4823-9997-a0cac82f7b63\") " pod="openshift-console/console-5fdbf9894c-n795s"
Apr 22 18:40:49.016575 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:49.016516 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName:
\"kubernetes.io/configmap/aa104b99-ce27-4823-9997-a0cac82f7b63-console-config\") pod \"console-5fdbf9894c-n795s\" (UID: \"aa104b99-ce27-4823-9997-a0cac82f7b63\") " pod="openshift-console/console-5fdbf9894c-n795s" Apr 22 18:40:49.017660 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:49.017396 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa104b99-ce27-4823-9997-a0cac82f7b63-service-ca\") pod \"console-5fdbf9894c-n795s\" (UID: \"aa104b99-ce27-4823-9997-a0cac82f7b63\") " pod="openshift-console/console-5fdbf9894c-n795s" Apr 22 18:40:49.017660 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:49.017496 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa104b99-ce27-4823-9997-a0cac82f7b63-console-config\") pod \"console-5fdbf9894c-n795s\" (UID: \"aa104b99-ce27-4823-9997-a0cac82f7b63\") " pod="openshift-console/console-5fdbf9894c-n795s" Apr 22 18:40:49.017660 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:49.017491 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa104b99-ce27-4823-9997-a0cac82f7b63-oauth-serving-cert\") pod \"console-5fdbf9894c-n795s\" (UID: \"aa104b99-ce27-4823-9997-a0cac82f7b63\") " pod="openshift-console/console-5fdbf9894c-n795s" Apr 22 18:40:49.017932 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:49.017758 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa104b99-ce27-4823-9997-a0cac82f7b63-trusted-ca-bundle\") pod \"console-5fdbf9894c-n795s\" (UID: \"aa104b99-ce27-4823-9997-a0cac82f7b63\") " pod="openshift-console/console-5fdbf9894c-n795s" Apr 22 18:40:49.021735 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:49.019650 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/aa104b99-ce27-4823-9997-a0cac82f7b63-console-oauth-config\") pod \"console-5fdbf9894c-n795s\" (UID: \"aa104b99-ce27-4823-9997-a0cac82f7b63\") " pod="openshift-console/console-5fdbf9894c-n795s" Apr 22 18:40:49.021735 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:49.020317 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa104b99-ce27-4823-9997-a0cac82f7b63-console-serving-cert\") pod \"console-5fdbf9894c-n795s\" (UID: \"aa104b99-ce27-4823-9997-a0cac82f7b63\") " pod="openshift-console/console-5fdbf9894c-n795s" Apr 22 18:40:49.030170 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:49.030141 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-78p5t\" (UniqueName: \"kubernetes.io/projected/aa104b99-ce27-4823-9997-a0cac82f7b63-kube-api-access-78p5t\") pod \"console-5fdbf9894c-n795s\" (UID: \"aa104b99-ce27-4823-9997-a0cac82f7b63\") " pod="openshift-console/console-5fdbf9894c-n795s" Apr 22 18:40:49.183994 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:49.183954 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5fdbf9894c-n795s" Apr 22 18:40:49.308389 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:49.308358 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5fdbf9894c-n795s"] Apr 22 18:40:49.311570 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:40:49.311539 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa104b99_ce27_4823_9997_a0cac82f7b63.slice/crio-f30baf6088cb4dff7543877df37815e1c538472d18ac0d4dcd0000d3ec568e05 WatchSource:0}: Error finding container f30baf6088cb4dff7543877df37815e1c538472d18ac0d4dcd0000d3ec568e05: Status 404 returned error can't find the container with id f30baf6088cb4dff7543877df37815e1c538472d18ac0d4dcd0000d3ec568e05 Apr 22 18:40:50.129716 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:50.129677 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fdbf9894c-n795s" event={"ID":"aa104b99-ce27-4823-9997-a0cac82f7b63","Type":"ContainerStarted","Data":"2e25116a387f5644c9fbf2ce90ef7d4df72efa9ba8a53fa44b0d19ea6b16e188"} Apr 22 18:40:50.129716 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:50.129713 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fdbf9894c-n795s" event={"ID":"aa104b99-ce27-4823-9997-a0cac82f7b63","Type":"ContainerStarted","Data":"f30baf6088cb4dff7543877df37815e1c538472d18ac0d4dcd0000d3ec568e05"} Apr 22 18:40:50.148131 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:50.148076 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5fdbf9894c-n795s" podStartSLOduration=2.148061094 podStartE2EDuration="2.148061094s" podCreationTimestamp="2026-04-22 18:40:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:40:50.146588551 +0000 UTC 
m=+179.136855180" watchObservedRunningTime="2026-04-22 18:40:50.148061094 +0000 UTC m=+179.138327722" Apr 22 18:40:59.184759 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:59.184719 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5fdbf9894c-n795s" Apr 22 18:40:59.185345 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:59.184771 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5fdbf9894c-n795s" Apr 22 18:40:59.189637 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:40:59.189610 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5fdbf9894c-n795s" Apr 22 18:41:00.159921 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:00.159893 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5fdbf9894c-n795s" Apr 22 18:41:00.208329 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:00.208293 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-685b7cdf4f-fdsjf"] Apr 22 18:41:25.227780 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:25.227743 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-685b7cdf4f-fdsjf" podUID="bfd80a83-e929-4ac8-ab34-dea0dd64ef2e" containerName="console" containerID="cri-o://ce362c993999a5ca250745e741af5489df76a29c364d74ed94da7e03b55fa41a" gracePeriod=15 Apr 22 18:41:25.466206 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:25.466185 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-685b7cdf4f-fdsjf_bfd80a83-e929-4ac8-ab34-dea0dd64ef2e/console/0.log" Apr 22 18:41:25.466318 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:25.466244 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-685b7cdf4f-fdsjf" Apr 22 18:41:25.582323 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:25.582235 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-oauth-serving-cert\") pod \"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e\" (UID: \"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e\") " Apr 22 18:41:25.582323 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:25.582278 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-console-serving-cert\") pod \"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e\" (UID: \"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e\") " Apr 22 18:41:25.582323 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:25.582305 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2bxl\" (UniqueName: \"kubernetes.io/projected/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-kube-api-access-d2bxl\") pod \"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e\" (UID: \"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e\") " Apr 22 18:41:25.582323 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:25.582326 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-service-ca\") pod \"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e\" (UID: \"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e\") " Apr 22 18:41:25.582702 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:25.582353 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-console-config\") pod \"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e\" (UID: \"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e\") " Apr 22 18:41:25.582702 
ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:25.582387 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-trusted-ca-bundle\") pod \"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e\" (UID: \"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e\") " Apr 22 18:41:25.582702 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:25.582413 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-console-oauth-config\") pod \"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e\" (UID: \"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e\") " Apr 22 18:41:25.582908 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:25.582785 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-service-ca" (OuterVolumeSpecName: "service-ca") pod "bfd80a83-e929-4ac8-ab34-dea0dd64ef2e" (UID: "bfd80a83-e929-4ac8-ab34-dea0dd64ef2e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:41:25.582908 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:25.582860 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-console-config" (OuterVolumeSpecName: "console-config") pod "bfd80a83-e929-4ac8-ab34-dea0dd64ef2e" (UID: "bfd80a83-e929-4ac8-ab34-dea0dd64ef2e"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:41:25.583083 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:25.582946 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "bfd80a83-e929-4ac8-ab34-dea0dd64ef2e" (UID: "bfd80a83-e929-4ac8-ab34-dea0dd64ef2e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:41:25.583083 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:25.582961 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "bfd80a83-e929-4ac8-ab34-dea0dd64ef2e" (UID: "bfd80a83-e929-4ac8-ab34-dea0dd64ef2e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:41:25.584427 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:25.584402 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "bfd80a83-e929-4ac8-ab34-dea0dd64ef2e" (UID: "bfd80a83-e929-4ac8-ab34-dea0dd64ef2e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:41:25.584632 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:25.584615 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "bfd80a83-e929-4ac8-ab34-dea0dd64ef2e" (UID: "bfd80a83-e929-4ac8-ab34-dea0dd64ef2e"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:41:25.584632 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:25.584624 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-kube-api-access-d2bxl" (OuterVolumeSpecName: "kube-api-access-d2bxl") pod "bfd80a83-e929-4ac8-ab34-dea0dd64ef2e" (UID: "bfd80a83-e929-4ac8-ab34-dea0dd64ef2e"). InnerVolumeSpecName "kube-api-access-d2bxl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:41:25.683071 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:25.683022 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-oauth-serving-cert\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:41:25.683071 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:25.683065 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-console-serving-cert\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:41:25.683071 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:25.683076 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d2bxl\" (UniqueName: \"kubernetes.io/projected/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-kube-api-access-d2bxl\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:41:25.683071 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:25.683086 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-service-ca\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:41:25.683340 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:25.683095 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-console-config\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:41:25.683340 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:25.683106 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-trusted-ca-bundle\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:41:25.683340 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:25.683115 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e-console-oauth-config\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:41:26.228834 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:26.228808 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-685b7cdf4f-fdsjf_bfd80a83-e929-4ac8-ab34-dea0dd64ef2e/console/0.log" Apr 22 18:41:26.229275 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:26.228848 2578 generic.go:358] "Generic (PLEG): container finished" podID="bfd80a83-e929-4ac8-ab34-dea0dd64ef2e" containerID="ce362c993999a5ca250745e741af5489df76a29c364d74ed94da7e03b55fa41a" exitCode=2 Apr 22 18:41:26.229275 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:26.228904 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-685b7cdf4f-fdsjf" event={"ID":"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e","Type":"ContainerDied","Data":"ce362c993999a5ca250745e741af5489df76a29c364d74ed94da7e03b55fa41a"} Apr 22 18:41:26.229275 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:26.228915 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-685b7cdf4f-fdsjf" Apr 22 18:41:26.229275 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:26.228935 2578 scope.go:117] "RemoveContainer" containerID="ce362c993999a5ca250745e741af5489df76a29c364d74ed94da7e03b55fa41a" Apr 22 18:41:26.229275 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:26.228926 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-685b7cdf4f-fdsjf" event={"ID":"bfd80a83-e929-4ac8-ab34-dea0dd64ef2e","Type":"ContainerDied","Data":"26528e2f03f9551afb844e516b9e9c8c690b7ae3e643213fc1f8dcca861af7f4"} Apr 22 18:41:26.237053 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:26.237029 2578 scope.go:117] "RemoveContainer" containerID="ce362c993999a5ca250745e741af5489df76a29c364d74ed94da7e03b55fa41a" Apr 22 18:41:26.237376 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:41:26.237356 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce362c993999a5ca250745e741af5489df76a29c364d74ed94da7e03b55fa41a\": container with ID starting with ce362c993999a5ca250745e741af5489df76a29c364d74ed94da7e03b55fa41a not found: ID does not exist" containerID="ce362c993999a5ca250745e741af5489df76a29c364d74ed94da7e03b55fa41a" Apr 22 18:41:26.237453 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:26.237390 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce362c993999a5ca250745e741af5489df76a29c364d74ed94da7e03b55fa41a"} err="failed to get container status \"ce362c993999a5ca250745e741af5489df76a29c364d74ed94da7e03b55fa41a\": rpc error: code = NotFound desc = could not find container \"ce362c993999a5ca250745e741af5489df76a29c364d74ed94da7e03b55fa41a\": container with ID starting with ce362c993999a5ca250745e741af5489df76a29c364d74ed94da7e03b55fa41a not found: ID does not exist" Apr 22 18:41:26.250831 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:26.250797 2578 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-685b7cdf4f-fdsjf"] Apr 22 18:41:26.255004 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:26.254978 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-685b7cdf4f-fdsjf"] Apr 22 18:41:27.538214 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:27.538170 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfd80a83-e929-4ac8-ab34-dea0dd64ef2e" path="/var/lib/kubelet/pods/bfd80a83-e929-4ac8-ab34-dea0dd64ef2e/volumes" Apr 22 18:41:37.898299 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:37.898261 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-9hghn"] Apr 22 18:41:37.898788 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:37.898544 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bfd80a83-e929-4ac8-ab34-dea0dd64ef2e" containerName="console" Apr 22 18:41:37.898788 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:37.898556 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfd80a83-e929-4ac8-ab34-dea0dd64ef2e" containerName="console" Apr 22 18:41:37.898788 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:37.898608 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="bfd80a83-e929-4ac8-ab34-dea0dd64ef2e" containerName="console" Apr 22 18:41:37.902781 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:37.902761 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-9hghn" Apr 22 18:41:37.905607 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:37.905583 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 18:41:37.906752 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:37.906727 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9hghn"] Apr 22 18:41:37.976111 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:37.976053 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/cbd68ab5-95f9-489c-a958-928dc293c79a-kubelet-config\") pod \"global-pull-secret-syncer-9hghn\" (UID: \"cbd68ab5-95f9-489c-a958-928dc293c79a\") " pod="kube-system/global-pull-secret-syncer-9hghn" Apr 22 18:41:37.976111 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:37.976106 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cbd68ab5-95f9-489c-a958-928dc293c79a-original-pull-secret\") pod \"global-pull-secret-syncer-9hghn\" (UID: \"cbd68ab5-95f9-489c-a958-928dc293c79a\") " pod="kube-system/global-pull-secret-syncer-9hghn" Apr 22 18:41:37.976310 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:37.976196 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/cbd68ab5-95f9-489c-a958-928dc293c79a-dbus\") pod \"global-pull-secret-syncer-9hghn\" (UID: \"cbd68ab5-95f9-489c-a958-928dc293c79a\") " pod="kube-system/global-pull-secret-syncer-9hghn" Apr 22 18:41:38.076871 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:38.076832 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/cbd68ab5-95f9-489c-a958-928dc293c79a-dbus\") pod \"global-pull-secret-syncer-9hghn\" (UID: \"cbd68ab5-95f9-489c-a958-928dc293c79a\") " pod="kube-system/global-pull-secret-syncer-9hghn" Apr 22 18:41:38.076998 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:38.076886 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/cbd68ab5-95f9-489c-a958-928dc293c79a-kubelet-config\") pod \"global-pull-secret-syncer-9hghn\" (UID: \"cbd68ab5-95f9-489c-a958-928dc293c79a\") " pod="kube-system/global-pull-secret-syncer-9hghn" Apr 22 18:41:38.076998 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:38.076905 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cbd68ab5-95f9-489c-a958-928dc293c79a-original-pull-secret\") pod \"global-pull-secret-syncer-9hghn\" (UID: \"cbd68ab5-95f9-489c-a958-928dc293c79a\") " pod="kube-system/global-pull-secret-syncer-9hghn" Apr 22 18:41:38.077071 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:38.077015 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/cbd68ab5-95f9-489c-a958-928dc293c79a-kubelet-config\") pod \"global-pull-secret-syncer-9hghn\" (UID: \"cbd68ab5-95f9-489c-a958-928dc293c79a\") " pod="kube-system/global-pull-secret-syncer-9hghn" Apr 22 18:41:38.077071 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:38.077015 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/cbd68ab5-95f9-489c-a958-928dc293c79a-dbus\") pod \"global-pull-secret-syncer-9hghn\" (UID: \"cbd68ab5-95f9-489c-a958-928dc293c79a\") " pod="kube-system/global-pull-secret-syncer-9hghn" Apr 22 18:41:38.079165 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:38.079145 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cbd68ab5-95f9-489c-a958-928dc293c79a-original-pull-secret\") pod \"global-pull-secret-syncer-9hghn\" (UID: \"cbd68ab5-95f9-489c-a958-928dc293c79a\") " pod="kube-system/global-pull-secret-syncer-9hghn" Apr 22 18:41:38.212588 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:38.212503 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9hghn" Apr 22 18:41:38.332164 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:38.332137 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9hghn"] Apr 22 18:41:38.334853 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:41:38.334827 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbd68ab5_95f9_489c_a958_928dc293c79a.slice/crio-c06e157925c82268c0ed1961764598c189127feefe4ff0bd9b5d8cfe980518bf WatchSource:0}: Error finding container c06e157925c82268c0ed1961764598c189127feefe4ff0bd9b5d8cfe980518bf: Status 404 returned error can't find the container with id c06e157925c82268c0ed1961764598c189127feefe4ff0bd9b5d8cfe980518bf Apr 22 18:41:39.267978 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:39.267942 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9hghn" event={"ID":"cbd68ab5-95f9-489c-a958-928dc293c79a","Type":"ContainerStarted","Data":"c06e157925c82268c0ed1961764598c189127feefe4ff0bd9b5d8cfe980518bf"} Apr 22 18:41:43.281246 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:43.281212 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9hghn" event={"ID":"cbd68ab5-95f9-489c-a958-928dc293c79a","Type":"ContainerStarted","Data":"0fa81291054007aae6d41993c6d88d4e2a52f3d813e61862a7c7399bf0961fa1"} Apr 22 18:41:43.299510 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:43.299446 2578 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-9hghn" podStartSLOduration=2.381808922 podStartE2EDuration="6.299429092s" podCreationTimestamp="2026-04-22 18:41:37 +0000 UTC" firstStartedPulling="2026-04-22 18:41:38.336683011 +0000 UTC m=+227.326949619" lastFinishedPulling="2026-04-22 18:41:42.254303168 +0000 UTC m=+231.244569789" observedRunningTime="2026-04-22 18:41:43.297680365 +0000 UTC m=+232.287946998" watchObservedRunningTime="2026-04-22 18:41:43.299429092 +0000 UTC m=+232.289695721" Apr 22 18:41:50.173754 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:50.173718 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bd5459877-f9pw2"] Apr 22 18:41:50.177114 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:50.177091 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bd5459877-f9pw2" Apr 22 18:41:50.180603 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:50.180582 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 22 18:41:50.180603 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:50.180592 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 22 18:41:50.180756 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:50.180583 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 22 18:41:50.180756 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:50.180639 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 22 
18:41:50.180756 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:50.180659 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-8fbs5\"" Apr 22 18:41:50.186150 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:50.186125 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bd5459877-f9pw2"] Apr 22 18:41:50.261776 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:50.261736 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdg5q\" (UniqueName: \"kubernetes.io/projected/e5fcd51c-dfc1-4561-ab04-c94ad9afcec4-kube-api-access-vdg5q\") pod \"managed-serviceaccount-addon-agent-6bd5459877-f9pw2\" (UID: \"e5fcd51c-dfc1-4561-ab04-c94ad9afcec4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bd5459877-f9pw2" Apr 22 18:41:50.261776 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:50.261778 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e5fcd51c-dfc1-4561-ab04-c94ad9afcec4-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6bd5459877-f9pw2\" (UID: \"e5fcd51c-dfc1-4561-ab04-c94ad9afcec4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bd5459877-f9pw2" Apr 22 18:41:50.273299 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:50.273268 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6bf99b74f4-c9zqz"] Apr 22 18:41:50.276341 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:50.276325 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6bf99b74f4-c9zqz" Apr 22 18:41:50.278727 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:50.278711 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 22 18:41:50.286967 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:50.286943 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6bf99b74f4-c9zqz"] Apr 22 18:41:50.362657 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:50.362611 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1540ae23-e8a0-497e-acbd-c2d042c25415-tmp\") pod \"klusterlet-addon-workmgr-6bf99b74f4-c9zqz\" (UID: \"1540ae23-e8a0-497e-acbd-c2d042c25415\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6bf99b74f4-c9zqz" Apr 22 18:41:50.362833 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:50.362682 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vdg5q\" (UniqueName: \"kubernetes.io/projected/e5fcd51c-dfc1-4561-ab04-c94ad9afcec4-kube-api-access-vdg5q\") pod \"managed-serviceaccount-addon-agent-6bd5459877-f9pw2\" (UID: \"e5fcd51c-dfc1-4561-ab04-c94ad9afcec4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bd5459877-f9pw2" Apr 22 18:41:50.362833 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:50.362722 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e5fcd51c-dfc1-4561-ab04-c94ad9afcec4-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6bd5459877-f9pw2\" (UID: \"e5fcd51c-dfc1-4561-ab04-c94ad9afcec4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bd5459877-f9pw2" Apr 22 
18:41:50.362833 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:50.362749 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dltrl\" (UniqueName: \"kubernetes.io/projected/1540ae23-e8a0-497e-acbd-c2d042c25415-kube-api-access-dltrl\") pod \"klusterlet-addon-workmgr-6bf99b74f4-c9zqz\" (UID: \"1540ae23-e8a0-497e-acbd-c2d042c25415\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6bf99b74f4-c9zqz" Apr 22 18:41:50.362833 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:50.362816 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/1540ae23-e8a0-497e-acbd-c2d042c25415-klusterlet-config\") pod \"klusterlet-addon-workmgr-6bf99b74f4-c9zqz\" (UID: \"1540ae23-e8a0-497e-acbd-c2d042c25415\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6bf99b74f4-c9zqz" Apr 22 18:41:50.365227 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:50.365195 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e5fcd51c-dfc1-4561-ab04-c94ad9afcec4-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6bd5459877-f9pw2\" (UID: \"e5fcd51c-dfc1-4561-ab04-c94ad9afcec4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bd5459877-f9pw2" Apr 22 18:41:50.372102 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:50.372077 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdg5q\" (UniqueName: \"kubernetes.io/projected/e5fcd51c-dfc1-4561-ab04-c94ad9afcec4-kube-api-access-vdg5q\") pod \"managed-serviceaccount-addon-agent-6bd5459877-f9pw2\" (UID: \"e5fcd51c-dfc1-4561-ab04-c94ad9afcec4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bd5459877-f9pw2" Apr 22 18:41:50.463513 ip-10-0-132-151 kubenswrapper[2578]: I0422 
18:41:50.463397 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/1540ae23-e8a0-497e-acbd-c2d042c25415-klusterlet-config\") pod \"klusterlet-addon-workmgr-6bf99b74f4-c9zqz\" (UID: \"1540ae23-e8a0-497e-acbd-c2d042c25415\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6bf99b74f4-c9zqz" Apr 22 18:41:50.463513 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:50.463451 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1540ae23-e8a0-497e-acbd-c2d042c25415-tmp\") pod \"klusterlet-addon-workmgr-6bf99b74f4-c9zqz\" (UID: \"1540ae23-e8a0-497e-acbd-c2d042c25415\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6bf99b74f4-c9zqz" Apr 22 18:41:50.463690 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:50.463639 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dltrl\" (UniqueName: \"kubernetes.io/projected/1540ae23-e8a0-497e-acbd-c2d042c25415-kube-api-access-dltrl\") pod \"klusterlet-addon-workmgr-6bf99b74f4-c9zqz\" (UID: \"1540ae23-e8a0-497e-acbd-c2d042c25415\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6bf99b74f4-c9zqz" Apr 22 18:41:50.463840 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:50.463821 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1540ae23-e8a0-497e-acbd-c2d042c25415-tmp\") pod \"klusterlet-addon-workmgr-6bf99b74f4-c9zqz\" (UID: \"1540ae23-e8a0-497e-acbd-c2d042c25415\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6bf99b74f4-c9zqz" Apr 22 18:41:50.465905 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:50.465878 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: 
\"kubernetes.io/secret/1540ae23-e8a0-497e-acbd-c2d042c25415-klusterlet-config\") pod \"klusterlet-addon-workmgr-6bf99b74f4-c9zqz\" (UID: \"1540ae23-e8a0-497e-acbd-c2d042c25415\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6bf99b74f4-c9zqz" Apr 22 18:41:50.472989 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:50.472963 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dltrl\" (UniqueName: \"kubernetes.io/projected/1540ae23-e8a0-497e-acbd-c2d042c25415-kube-api-access-dltrl\") pod \"klusterlet-addon-workmgr-6bf99b74f4-c9zqz\" (UID: \"1540ae23-e8a0-497e-acbd-c2d042c25415\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6bf99b74f4-c9zqz" Apr 22 18:41:50.494978 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:50.494950 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bd5459877-f9pw2" Apr 22 18:41:50.586981 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:50.586949 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6bf99b74f4-c9zqz" Apr 22 18:41:50.611854 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:50.611793 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bd5459877-f9pw2"] Apr 22 18:41:50.614430 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:41:50.614396 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5fcd51c_dfc1_4561_ab04_c94ad9afcec4.slice/crio-d853f914c71aded29943673d2f54a7acbfae01e09d809a8536a64dcc2d7cdfc0 WatchSource:0}: Error finding container d853f914c71aded29943673d2f54a7acbfae01e09d809a8536a64dcc2d7cdfc0: Status 404 returned error can't find the container with id d853f914c71aded29943673d2f54a7acbfae01e09d809a8536a64dcc2d7cdfc0 Apr 22 18:41:50.704181 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:50.704157 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6bf99b74f4-c9zqz"] Apr 22 18:41:50.706433 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:41:50.706408 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1540ae23_e8a0_497e_acbd_c2d042c25415.slice/crio-1cb4f831e4995fc622ec64bae047104beffb721df607a4868ef7caeb1605ef79 WatchSource:0}: Error finding container 1cb4f831e4995fc622ec64bae047104beffb721df607a4868ef7caeb1605ef79: Status 404 returned error can't find the container with id 1cb4f831e4995fc622ec64bae047104beffb721df607a4868ef7caeb1605ef79 Apr 22 18:41:51.304512 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:51.304475 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6bf99b74f4-c9zqz" 
event={"ID":"1540ae23-e8a0-497e-acbd-c2d042c25415","Type":"ContainerStarted","Data":"1cb4f831e4995fc622ec64bae047104beffb721df607a4868ef7caeb1605ef79"} Apr 22 18:41:51.305373 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:51.305348 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bd5459877-f9pw2" event={"ID":"e5fcd51c-dfc1-4561-ab04-c94ad9afcec4","Type":"ContainerStarted","Data":"d853f914c71aded29943673d2f54a7acbfae01e09d809a8536a64dcc2d7cdfc0"} Apr 22 18:41:54.314863 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:54.314829 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bd5459877-f9pw2" event={"ID":"e5fcd51c-dfc1-4561-ab04-c94ad9afcec4","Type":"ContainerStarted","Data":"7f76cf6a5aeeb70579da5aca4efc9047f1886df2f895d36839071388af6a3e7b"} Apr 22 18:41:54.332123 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:41:54.332073 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bd5459877-f9pw2" podStartSLOduration=1.3015850709999999 podStartE2EDuration="4.33205947s" podCreationTimestamp="2026-04-22 18:41:50 +0000 UTC" firstStartedPulling="2026-04-22 18:41:50.616168755 +0000 UTC m=+239.606435362" lastFinishedPulling="2026-04-22 18:41:53.646643151 +0000 UTC m=+242.636909761" observedRunningTime="2026-04-22 18:41:54.331760473 +0000 UTC m=+243.322027104" watchObservedRunningTime="2026-04-22 18:41:54.33205947 +0000 UTC m=+243.322326098" Apr 22 18:42:06.351411 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:06.351369 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6bf99b74f4-c9zqz" event={"ID":"1540ae23-e8a0-497e-acbd-c2d042c25415","Type":"ContainerStarted","Data":"2e742a5adc8fdb039b498a16bacb5b1de48dcbeb42f6962206bd8b7014e0af7a"} Apr 22 
18:42:06.351913 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:06.351583 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6bf99b74f4-c9zqz" Apr 22 18:42:06.353179 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:06.353157 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6bf99b74f4-c9zqz" Apr 22 18:42:06.370587 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:06.370541 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6bf99b74f4-c9zqz" podStartSLOduration=0.864978295 podStartE2EDuration="16.370530043s" podCreationTimestamp="2026-04-22 18:41:50 +0000 UTC" firstStartedPulling="2026-04-22 18:41:50.708067873 +0000 UTC m=+239.698334480" lastFinishedPulling="2026-04-22 18:42:06.213619617 +0000 UTC m=+255.203886228" observedRunningTime="2026-04-22 18:42:06.369381721 +0000 UTC m=+255.359648350" watchObservedRunningTime="2026-04-22 18:42:06.370530043 +0000 UTC m=+255.360796671" Apr 22 18:42:17.241306 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:17.241220 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-gt49t"] Apr 22 18:42:17.245060 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:17.245037 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-gt49t" Apr 22 18:42:17.247869 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:17.247846 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-j9ldg\"" Apr 22 18:42:17.248032 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:17.247854 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 22 18:42:17.248083 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:17.248060 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 22 18:42:17.248159 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:17.248146 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 22 18:42:17.256087 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:17.256067 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-gt49t"] Apr 22 18:42:17.266768 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:17.266742 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/0b1972e8-a66f-4c49-9b44-a3dbd37c833a-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-gt49t\" (UID: \"0b1972e8-a66f-4c49-9b44-a3dbd37c833a\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-gt49t" Apr 22 18:42:17.266883 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:17.266782 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqxc8\" (UniqueName: \"kubernetes.io/projected/0b1972e8-a66f-4c49-9b44-a3dbd37c833a-kube-api-access-kqxc8\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-gt49t\" (UID: 
\"0b1972e8-a66f-4c49-9b44-a3dbd37c833a\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-gt49t" Apr 22 18:42:17.367597 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:17.367567 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/0b1972e8-a66f-4c49-9b44-a3dbd37c833a-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-gt49t\" (UID: \"0b1972e8-a66f-4c49-9b44-a3dbd37c833a\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-gt49t" Apr 22 18:42:17.367744 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:17.367614 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqxc8\" (UniqueName: \"kubernetes.io/projected/0b1972e8-a66f-4c49-9b44-a3dbd37c833a-kube-api-access-kqxc8\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-gt49t\" (UID: \"0b1972e8-a66f-4c49-9b44-a3dbd37c833a\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-gt49t" Apr 22 18:42:17.369997 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:17.369979 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/0b1972e8-a66f-4c49-9b44-a3dbd37c833a-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-gt49t\" (UID: \"0b1972e8-a66f-4c49-9b44-a3dbd37c833a\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-gt49t" Apr 22 18:42:17.376614 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:17.376593 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqxc8\" (UniqueName: \"kubernetes.io/projected/0b1972e8-a66f-4c49-9b44-a3dbd37c833a-kube-api-access-kqxc8\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-gt49t\" (UID: \"0b1972e8-a66f-4c49-9b44-a3dbd37c833a\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-gt49t" Apr 22 18:42:17.554709 ip-10-0-132-151 
kubenswrapper[2578]: I0422 18:42:17.554629 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-gt49t" Apr 22 18:42:17.686181 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:17.686102 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-gt49t"] Apr 22 18:42:17.688614 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:42:17.688585 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b1972e8_a66f_4c49_9b44_a3dbd37c833a.slice/crio-959b4a24ef233aeb8b51587210260285fa149d1de69d43c2860a7212a14af335 WatchSource:0}: Error finding container 959b4a24ef233aeb8b51587210260285fa149d1de69d43c2860a7212a14af335: Status 404 returned error can't find the container with id 959b4a24ef233aeb8b51587210260285fa149d1de69d43c2860a7212a14af335 Apr 22 18:42:18.383092 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:18.383053 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-gt49t" event={"ID":"0b1972e8-a66f-4c49-9b44-a3dbd37c833a","Type":"ContainerStarted","Data":"959b4a24ef233aeb8b51587210260285fa149d1de69d43c2860a7212a14af335"} Apr 22 18:42:22.396335 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:22.396295 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-gt49t" event={"ID":"0b1972e8-a66f-4c49-9b44-a3dbd37c833a","Type":"ContainerStarted","Data":"5c1204e27c0d59ac5503c5d2a60faa0136eb853cfd69afc08985b74cbaceeee8"} Apr 22 18:42:22.396824 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:22.396430 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-gt49t" Apr 22 18:42:22.423018 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:22.420142 2578 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-gt49t" podStartSLOduration=1.499453957 podStartE2EDuration="5.420117954s" podCreationTimestamp="2026-04-22 18:42:17 +0000 UTC" firstStartedPulling="2026-04-22 18:42:17.690301248 +0000 UTC m=+266.680567855" lastFinishedPulling="2026-04-22 18:42:21.610965231 +0000 UTC m=+270.601231852" observedRunningTime="2026-04-22 18:42:22.417932918 +0000 UTC m=+271.408199546" watchObservedRunningTime="2026-04-22 18:42:22.420117954 +0000 UTC m=+271.410384586" Apr 22 18:42:22.575132 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:22.575095 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-pj6sr"] Apr 22 18:42:22.578322 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:22.578301 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pj6sr" Apr 22 18:42:22.580728 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:22.580709 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-w85zh\"" Apr 22 18:42:22.580837 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:22.580760 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 22 18:42:22.580837 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:22.580750 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 22 18:42:22.586092 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:22.586070 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-pj6sr"] Apr 22 18:42:22.607997 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:22.607959 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-hw8kr\" (UniqueName: \"kubernetes.io/projected/dc400564-ecfa-45cd-8c60-9eb143f5d55b-kube-api-access-hw8kr\") pod \"keda-metrics-apiserver-7c9f485588-pj6sr\" (UID: \"dc400564-ecfa-45cd-8c60-9eb143f5d55b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pj6sr" Apr 22 18:42:22.608247 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:22.608224 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/dc400564-ecfa-45cd-8c60-9eb143f5d55b-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-pj6sr\" (UID: \"dc400564-ecfa-45cd-8c60-9eb143f5d55b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pj6sr" Apr 22 18:42:22.608336 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:22.608324 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dc400564-ecfa-45cd-8c60-9eb143f5d55b-certificates\") pod \"keda-metrics-apiserver-7c9f485588-pj6sr\" (UID: \"dc400564-ecfa-45cd-8c60-9eb143f5d55b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pj6sr" Apr 22 18:42:22.709708 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:22.709622 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hw8kr\" (UniqueName: \"kubernetes.io/projected/dc400564-ecfa-45cd-8c60-9eb143f5d55b-kube-api-access-hw8kr\") pod \"keda-metrics-apiserver-7c9f485588-pj6sr\" (UID: \"dc400564-ecfa-45cd-8c60-9eb143f5d55b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pj6sr" Apr 22 18:42:22.709708 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:22.709671 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/dc400564-ecfa-45cd-8c60-9eb143f5d55b-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-pj6sr\" (UID: \"dc400564-ecfa-45cd-8c60-9eb143f5d55b\") " 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pj6sr" Apr 22 18:42:22.709708 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:22.709702 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dc400564-ecfa-45cd-8c60-9eb143f5d55b-certificates\") pod \"keda-metrics-apiserver-7c9f485588-pj6sr\" (UID: \"dc400564-ecfa-45cd-8c60-9eb143f5d55b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pj6sr" Apr 22 18:42:22.709927 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:42:22.709779 2578 secret.go:281] references non-existent secret key: tls.crt Apr 22 18:42:22.709927 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:42:22.709790 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 18:42:22.709927 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:42:22.709807 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-pj6sr: references non-existent secret key: tls.crt Apr 22 18:42:22.709927 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:42:22.709865 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dc400564-ecfa-45cd-8c60-9eb143f5d55b-certificates podName:dc400564-ecfa-45cd-8c60-9eb143f5d55b nodeName:}" failed. No retries permitted until 2026-04-22 18:42:23.20984913 +0000 UTC m=+272.200115737 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/dc400564-ecfa-45cd-8c60-9eb143f5d55b-certificates") pod "keda-metrics-apiserver-7c9f485588-pj6sr" (UID: "dc400564-ecfa-45cd-8c60-9eb143f5d55b") : references non-existent secret key: tls.crt Apr 22 18:42:22.710075 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:22.710053 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/dc400564-ecfa-45cd-8c60-9eb143f5d55b-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-pj6sr\" (UID: \"dc400564-ecfa-45cd-8c60-9eb143f5d55b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pj6sr" Apr 22 18:42:22.720305 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:22.720276 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw8kr\" (UniqueName: \"kubernetes.io/projected/dc400564-ecfa-45cd-8c60-9eb143f5d55b-kube-api-access-hw8kr\") pod \"keda-metrics-apiserver-7c9f485588-pj6sr\" (UID: \"dc400564-ecfa-45cd-8c60-9eb143f5d55b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pj6sr" Apr 22 18:42:23.215048 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:23.215014 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dc400564-ecfa-45cd-8c60-9eb143f5d55b-certificates\") pod \"keda-metrics-apiserver-7c9f485588-pj6sr\" (UID: \"dc400564-ecfa-45cd-8c60-9eb143f5d55b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pj6sr" Apr 22 18:42:23.215266 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:42:23.215182 2578 secret.go:281] references non-existent secret key: tls.crt Apr 22 18:42:23.215266 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:42:23.215205 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 18:42:23.215266 ip-10-0-132-151 kubenswrapper[2578]: E0422 
18:42:23.215230 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-pj6sr: references non-existent secret key: tls.crt Apr 22 18:42:23.215436 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:42:23.215311 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dc400564-ecfa-45cd-8c60-9eb143f5d55b-certificates podName:dc400564-ecfa-45cd-8c60-9eb143f5d55b nodeName:}" failed. No retries permitted until 2026-04-22 18:42:24.21529022 +0000 UTC m=+273.205556846 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/dc400564-ecfa-45cd-8c60-9eb143f5d55b-certificates") pod "keda-metrics-apiserver-7c9f485588-pj6sr" (UID: "dc400564-ecfa-45cd-8c60-9eb143f5d55b") : references non-existent secret key: tls.crt Apr 22 18:42:24.224225 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:24.224182 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dc400564-ecfa-45cd-8c60-9eb143f5d55b-certificates\") pod \"keda-metrics-apiserver-7c9f485588-pj6sr\" (UID: \"dc400564-ecfa-45cd-8c60-9eb143f5d55b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pj6sr" Apr 22 18:42:24.224648 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:42:24.224366 2578 secret.go:281] references non-existent secret key: tls.crt Apr 22 18:42:24.224648 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:42:24.224386 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 18:42:24.224648 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:42:24.224404 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-pj6sr: references non-existent secret key: tls.crt Apr 22 18:42:24.224648 ip-10-0-132-151 kubenswrapper[2578]: E0422 
18:42:24.224498 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dc400564-ecfa-45cd-8c60-9eb143f5d55b-certificates podName:dc400564-ecfa-45cd-8c60-9eb143f5d55b nodeName:}" failed. No retries permitted until 2026-04-22 18:42:26.224484334 +0000 UTC m=+275.214750940 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/dc400564-ecfa-45cd-8c60-9eb143f5d55b-certificates") pod "keda-metrics-apiserver-7c9f485588-pj6sr" (UID: "dc400564-ecfa-45cd-8c60-9eb143f5d55b") : references non-existent secret key: tls.crt Apr 22 18:42:26.242432 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:26.242385 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dc400564-ecfa-45cd-8c60-9eb143f5d55b-certificates\") pod \"keda-metrics-apiserver-7c9f485588-pj6sr\" (UID: \"dc400564-ecfa-45cd-8c60-9eb143f5d55b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pj6sr" Apr 22 18:42:26.242842 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:42:26.242586 2578 secret.go:281] references non-existent secret key: tls.crt Apr 22 18:42:26.242842 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:42:26.242615 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 18:42:26.242842 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:42:26.242639 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-pj6sr: references non-existent secret key: tls.crt Apr 22 18:42:26.242842 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:42:26.242715 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dc400564-ecfa-45cd-8c60-9eb143f5d55b-certificates podName:dc400564-ecfa-45cd-8c60-9eb143f5d55b nodeName:}" failed. 
No retries permitted until 2026-04-22 18:42:30.24269566 +0000 UTC m=+279.232962272 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/dc400564-ecfa-45cd-8c60-9eb143f5d55b-certificates") pod "keda-metrics-apiserver-7c9f485588-pj6sr" (UID: "dc400564-ecfa-45cd-8c60-9eb143f5d55b") : references non-existent secret key: tls.crt
Apr 22 18:42:30.275736 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:30.275694 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dc400564-ecfa-45cd-8c60-9eb143f5d55b-certificates\") pod \"keda-metrics-apiserver-7c9f485588-pj6sr\" (UID: \"dc400564-ecfa-45cd-8c60-9eb143f5d55b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pj6sr"
Apr 22 18:42:30.278203 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:30.278181 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dc400564-ecfa-45cd-8c60-9eb143f5d55b-certificates\") pod \"keda-metrics-apiserver-7c9f485588-pj6sr\" (UID: \"dc400564-ecfa-45cd-8c60-9eb143f5d55b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pj6sr"
Apr 22 18:42:30.390223 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:30.390186 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pj6sr"
Apr 22 18:42:30.509171 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:30.509132 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-pj6sr"]
Apr 22 18:42:30.511561 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:42:30.511533 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc400564_ecfa_45cd_8c60_9eb143f5d55b.slice/crio-5955067313b59f4d16e0f5a6fbd158071427cf9ef85c8e935f0023a89d5f2814 WatchSource:0}: Error finding container 5955067313b59f4d16e0f5a6fbd158071427cf9ef85c8e935f0023a89d5f2814: Status 404 returned error can't find the container with id 5955067313b59f4d16e0f5a6fbd158071427cf9ef85c8e935f0023a89d5f2814
Apr 22 18:42:31.423126 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:31.423081 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pj6sr" event={"ID":"dc400564-ecfa-45cd-8c60-9eb143f5d55b","Type":"ContainerStarted","Data":"5955067313b59f4d16e0f5a6fbd158071427cf9ef85c8e935f0023a89d5f2814"}
Apr 22 18:42:33.430081 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:33.430046 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pj6sr" event={"ID":"dc400564-ecfa-45cd-8c60-9eb143f5d55b","Type":"ContainerStarted","Data":"ffc60dc0408653c9b96c6272aef1e7fdd492885b75c0a3f70284cc40ce3c8582"}
Apr 22 18:42:33.430580 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:33.430157 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pj6sr"
Apr 22 18:42:33.447510 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:33.447439 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pj6sr" podStartSLOduration=9.03719458 podStartE2EDuration="11.447426265s" podCreationTimestamp="2026-04-22 18:42:22 +0000 UTC" firstStartedPulling="2026-04-22 18:42:30.512834442 +0000 UTC m=+279.503101048" lastFinishedPulling="2026-04-22 18:42:32.923066122 +0000 UTC m=+281.913332733" observedRunningTime="2026-04-22 18:42:33.446003712 +0000 UTC m=+282.436270342" watchObservedRunningTime="2026-04-22 18:42:33.447426265 +0000 UTC m=+282.437692894"
Apr 22 18:42:43.401904 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:43.401868 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-gt49t"
Apr 22 18:42:44.438177 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:44.438146 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pj6sr"
Apr 22 18:42:51.426248 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:51.426208 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2rx9_14a4227e-0dba-42ae-a250-c23eb012b836/ovn-acl-logging/0.log"
Apr 22 18:42:51.427293 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:51.427269 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2rx9_14a4227e-0dba-42ae-a250-c23eb012b836/ovn-acl-logging/0.log"
Apr 22 18:42:51.431894 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:42:51.431868 2578 kubelet.go:1628] "Image garbage collection succeeded"
Apr 22 18:43:35.855195 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:43:35.855157 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-5x5c2"]
Apr 22 18:43:35.858253 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:43:35.858237 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-5x5c2"
Apr 22 18:43:35.860822 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:43:35.860800 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 22 18:43:35.861715 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:43:35.861687 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 22 18:43:35.861827 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:43:35.861687 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-shfjh\""
Apr 22 18:43:35.867420 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:43:35.867401 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-5x5c2"]
Apr 22 18:43:35.953650 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:43:35.953607 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59dt5\" (UniqueName: \"kubernetes.io/projected/dbea9839-fab8-4ebe-9499-360b8dac959b-kube-api-access-59dt5\") pod \"cert-manager-webhook-587ccfb98-5x5c2\" (UID: \"dbea9839-fab8-4ebe-9499-360b8dac959b\") " pod="cert-manager/cert-manager-webhook-587ccfb98-5x5c2"
Apr 22 18:43:35.953822 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:43:35.953673 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dbea9839-fab8-4ebe-9499-360b8dac959b-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-5x5c2\" (UID: \"dbea9839-fab8-4ebe-9499-360b8dac959b\") " pod="cert-manager/cert-manager-webhook-587ccfb98-5x5c2"
Apr 22 18:43:36.054577 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:43:36.054543 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dbea9839-fab8-4ebe-9499-360b8dac959b-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-5x5c2\" (UID: \"dbea9839-fab8-4ebe-9499-360b8dac959b\") " pod="cert-manager/cert-manager-webhook-587ccfb98-5x5c2"
Apr 22 18:43:36.054731 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:43:36.054594 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-59dt5\" (UniqueName: \"kubernetes.io/projected/dbea9839-fab8-4ebe-9499-360b8dac959b-kube-api-access-59dt5\") pod \"cert-manager-webhook-587ccfb98-5x5c2\" (UID: \"dbea9839-fab8-4ebe-9499-360b8dac959b\") " pod="cert-manager/cert-manager-webhook-587ccfb98-5x5c2"
Apr 22 18:43:36.064921 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:43:36.064891 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dbea9839-fab8-4ebe-9499-360b8dac959b-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-5x5c2\" (UID: \"dbea9839-fab8-4ebe-9499-360b8dac959b\") " pod="cert-manager/cert-manager-webhook-587ccfb98-5x5c2"
Apr 22 18:43:36.065078 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:43:36.065008 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-59dt5\" (UniqueName: \"kubernetes.io/projected/dbea9839-fab8-4ebe-9499-360b8dac959b-kube-api-access-59dt5\") pod \"cert-manager-webhook-587ccfb98-5x5c2\" (UID: \"dbea9839-fab8-4ebe-9499-360b8dac959b\") " pod="cert-manager/cert-manager-webhook-587ccfb98-5x5c2"
Apr 22 18:43:36.168027 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:43:36.167996 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-5x5c2"
Apr 22 18:43:36.294730 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:43:36.294684 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-5x5c2"]
Apr 22 18:43:36.296966 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:43:36.296936 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbea9839_fab8_4ebe_9499_360b8dac959b.slice/crio-bfe28a594626648a651e9679b4d950b9f78106719418bed6febc1b0252e706dd WatchSource:0}: Error finding container bfe28a594626648a651e9679b4d950b9f78106719418bed6febc1b0252e706dd: Status 404 returned error can't find the container with id bfe28a594626648a651e9679b4d950b9f78106719418bed6febc1b0252e706dd
Apr 22 18:43:36.298632 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:43:36.298610 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 18:43:36.599002 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:43:36.598918 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-5x5c2" event={"ID":"dbea9839-fab8-4ebe-9499-360b8dac959b","Type":"ContainerStarted","Data":"bfe28a594626648a651e9679b4d950b9f78106719418bed6febc1b0252e706dd"}
Apr 22 18:43:39.608360 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:43:39.608323 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-5x5c2" event={"ID":"dbea9839-fab8-4ebe-9499-360b8dac959b","Type":"ContainerStarted","Data":"aeb223462a135bbeeb8b1e21213a4c58a80f89ace783cb5af15a1e64ace6a300"}
Apr 22 18:43:39.608846 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:43:39.608377 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-5x5c2"
Apr 22 18:43:39.626457 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:43:39.626379 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-5x5c2" podStartSLOduration=1.8163326450000001 podStartE2EDuration="4.626362078s" podCreationTimestamp="2026-04-22 18:43:35 +0000 UTC" firstStartedPulling="2026-04-22 18:43:36.298739067 +0000 UTC m=+345.289005673" lastFinishedPulling="2026-04-22 18:43:39.1087685 +0000 UTC m=+348.099035106" observedRunningTime="2026-04-22 18:43:39.625842863 +0000 UTC m=+348.616109494" watchObservedRunningTime="2026-04-22 18:43:39.626362078 +0000 UTC m=+348.616628709"
Apr 22 18:43:45.612489 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:43:45.612433 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-5x5c2"
Apr 22 18:44:18.761356 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:44:18.761326 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-59b94d4c58-q8sqm"]
Apr 22 18:44:18.765211 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:44:18.765192 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-q8sqm"
Apr 22 18:44:18.770139 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:44:18.770107 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:44:18.770817 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:44:18.770784 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 22 18:44:18.770817 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:44:18.770801 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 22 18:44:18.770987 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:44:18.770792 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 22 18:44:18.771187 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:44:18.771170 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 22 18:44:18.771292 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:44:18.771277 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-cpc6t\""
Apr 22 18:44:18.784069 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:44:18.784047 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-59b94d4c58-q8sqm"]
Apr 22 18:44:18.869878 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:44:18.869841 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/f2c5a268-5d72-4c28-a3bc-043441856d64-manager-config\") pod \"lws-controller-manager-59b94d4c58-q8sqm\" (UID: \"f2c5a268-5d72-4c28-a3bc-043441856d64\") " pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-q8sqm"
Apr 22 18:44:18.870062 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:44:18.869898 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f2c5a268-5d72-4c28-a3bc-043441856d64-cert\") pod \"lws-controller-manager-59b94d4c58-q8sqm\" (UID: \"f2c5a268-5d72-4c28-a3bc-043441856d64\") " pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-q8sqm"
Apr 22 18:44:18.870062 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:44:18.869921 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/f2c5a268-5d72-4c28-a3bc-043441856d64-metrics-cert\") pod \"lws-controller-manager-59b94d4c58-q8sqm\" (UID: \"f2c5a268-5d72-4c28-a3bc-043441856d64\") " pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-q8sqm"
Apr 22 18:44:18.870062 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:44:18.869943 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96j4f\" (UniqueName: \"kubernetes.io/projected/f2c5a268-5d72-4c28-a3bc-043441856d64-kube-api-access-96j4f\") pod \"lws-controller-manager-59b94d4c58-q8sqm\" (UID: \"f2c5a268-5d72-4c28-a3bc-043441856d64\") " pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-q8sqm"
Apr 22 18:44:18.970731 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:44:18.970689 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/f2c5a268-5d72-4c28-a3bc-043441856d64-manager-config\") pod \"lws-controller-manager-59b94d4c58-q8sqm\" (UID: \"f2c5a268-5d72-4c28-a3bc-043441856d64\") " pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-q8sqm"
Apr 22 18:44:18.970905 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:44:18.970777 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f2c5a268-5d72-4c28-a3bc-043441856d64-cert\") pod \"lws-controller-manager-59b94d4c58-q8sqm\" (UID: \"f2c5a268-5d72-4c28-a3bc-043441856d64\") " pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-q8sqm"
Apr 22 18:44:18.970905 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:44:18.970809 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/f2c5a268-5d72-4c28-a3bc-043441856d64-metrics-cert\") pod \"lws-controller-manager-59b94d4c58-q8sqm\" (UID: \"f2c5a268-5d72-4c28-a3bc-043441856d64\") " pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-q8sqm"
Apr 22 18:44:18.970905 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:44:18.970842 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-96j4f\" (UniqueName: \"kubernetes.io/projected/f2c5a268-5d72-4c28-a3bc-043441856d64-kube-api-access-96j4f\") pod \"lws-controller-manager-59b94d4c58-q8sqm\" (UID: \"f2c5a268-5d72-4c28-a3bc-043441856d64\") " pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-q8sqm"
Apr 22 18:44:18.971524 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:44:18.971503 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/f2c5a268-5d72-4c28-a3bc-043441856d64-manager-config\") pod \"lws-controller-manager-59b94d4c58-q8sqm\" (UID: \"f2c5a268-5d72-4c28-a3bc-043441856d64\") " pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-q8sqm"
Apr 22 18:44:18.973324 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:44:18.973299 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f2c5a268-5d72-4c28-a3bc-043441856d64-cert\") pod \"lws-controller-manager-59b94d4c58-q8sqm\" (UID: \"f2c5a268-5d72-4c28-a3bc-043441856d64\") " pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-q8sqm"
Apr 22 18:44:18.973416 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:44:18.973355 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/f2c5a268-5d72-4c28-a3bc-043441856d64-metrics-cert\") pod \"lws-controller-manager-59b94d4c58-q8sqm\" (UID: \"f2c5a268-5d72-4c28-a3bc-043441856d64\") " pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-q8sqm"
Apr 22 18:44:18.983999 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:44:18.983973 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-96j4f\" (UniqueName: \"kubernetes.io/projected/f2c5a268-5d72-4c28-a3bc-043441856d64-kube-api-access-96j4f\") pod \"lws-controller-manager-59b94d4c58-q8sqm\" (UID: \"f2c5a268-5d72-4c28-a3bc-043441856d64\") " pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-q8sqm"
Apr 22 18:44:19.074575 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:44:19.074477 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-q8sqm"
Apr 22 18:44:19.211654 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:44:19.211621 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-59b94d4c58-q8sqm"]
Apr 22 18:44:19.215489 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:44:19.215443 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2c5a268_5d72_4c28_a3bc_043441856d64.slice/crio-5bf87a5e86372c9852dc04a69c489ae973f1e86c6723cb5d3209cf42cf83208f WatchSource:0}: Error finding container 5bf87a5e86372c9852dc04a69c489ae973f1e86c6723cb5d3209cf42cf83208f: Status 404 returned error can't find the container with id 5bf87a5e86372c9852dc04a69c489ae973f1e86c6723cb5d3209cf42cf83208f
Apr 22 18:44:19.719490 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:44:19.719434 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-q8sqm" event={"ID":"f2c5a268-5d72-4c28-a3bc-043441856d64","Type":"ContainerStarted","Data":"5bf87a5e86372c9852dc04a69c489ae973f1e86c6723cb5d3209cf42cf83208f"}
Apr 22 18:44:22.731529 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:44:22.731492 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-q8sqm" event={"ID":"f2c5a268-5d72-4c28-a3bc-043441856d64","Type":"ContainerStarted","Data":"fce5d935fb5dcb48b3bb23a6f3721524d9de353fe584dd01f5555ebef80251ec"}
Apr 22 18:44:22.731959 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:44:22.731656 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-q8sqm"
Apr 22 18:44:22.751863 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:44:22.751817 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-q8sqm" podStartSLOduration=2.256565255 podStartE2EDuration="4.751805988s" podCreationTimestamp="2026-04-22 18:44:18 +0000 UTC" firstStartedPulling="2026-04-22 18:44:19.217146567 +0000 UTC m=+388.207413174" lastFinishedPulling="2026-04-22 18:44:21.712387297 +0000 UTC m=+390.702653907" observedRunningTime="2026-04-22 18:44:22.750479161 +0000 UTC m=+391.740745781" watchObservedRunningTime="2026-04-22 18:44:22.751805988 +0000 UTC m=+391.742072617"
Apr 22 18:44:33.736922 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:44:33.736892 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-q8sqm"
Apr 22 18:45:04.587385 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:04.587347 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5fdbf9894c-n795s"]
Apr 22 18:45:29.606513 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:29.606434 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5fdbf9894c-n795s" podUID="aa104b99-ce27-4823-9997-a0cac82f7b63" containerName="console" containerID="cri-o://2e25116a387f5644c9fbf2ce90ef7d4df72efa9ba8a53fa44b0d19ea6b16e188" gracePeriod=15
Apr 22 18:45:29.841499 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:29.841473 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5fdbf9894c-n795s_aa104b99-ce27-4823-9997-a0cac82f7b63/console/0.log"
Apr 22 18:45:29.841646 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:29.841534 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5fdbf9894c-n795s"
Apr 22 18:45:29.919474 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:29.919436 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa104b99-ce27-4823-9997-a0cac82f7b63-service-ca\") pod \"aa104b99-ce27-4823-9997-a0cac82f7b63\" (UID: \"aa104b99-ce27-4823-9997-a0cac82f7b63\") "
Apr 22 18:45:29.919666 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:29.919503 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa104b99-ce27-4823-9997-a0cac82f7b63-console-oauth-config\") pod \"aa104b99-ce27-4823-9997-a0cac82f7b63\" (UID: \"aa104b99-ce27-4823-9997-a0cac82f7b63\") "
Apr 22 18:45:29.919666 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:29.919540 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa104b99-ce27-4823-9997-a0cac82f7b63-oauth-serving-cert\") pod \"aa104b99-ce27-4823-9997-a0cac82f7b63\" (UID: \"aa104b99-ce27-4823-9997-a0cac82f7b63\") "
Apr 22 18:45:29.919666 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:29.919566 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa104b99-ce27-4823-9997-a0cac82f7b63-trusted-ca-bundle\") pod \"aa104b99-ce27-4823-9997-a0cac82f7b63\" (UID: \"aa104b99-ce27-4823-9997-a0cac82f7b63\") "
Apr 22 18:45:29.919666 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:29.919621 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78p5t\" (UniqueName: \"kubernetes.io/projected/aa104b99-ce27-4823-9997-a0cac82f7b63-kube-api-access-78p5t\") pod \"aa104b99-ce27-4823-9997-a0cac82f7b63\" (UID: \"aa104b99-ce27-4823-9997-a0cac82f7b63\") "
Apr 22 18:45:29.919666 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:29.919643 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa104b99-ce27-4823-9997-a0cac82f7b63-console-serving-cert\") pod \"aa104b99-ce27-4823-9997-a0cac82f7b63\" (UID: \"aa104b99-ce27-4823-9997-a0cac82f7b63\") "
Apr 22 18:45:29.919918 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:29.919683 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa104b99-ce27-4823-9997-a0cac82f7b63-console-config\") pod \"aa104b99-ce27-4823-9997-a0cac82f7b63\" (UID: \"aa104b99-ce27-4823-9997-a0cac82f7b63\") "
Apr 22 18:45:29.920042 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:29.919969 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa104b99-ce27-4823-9997-a0cac82f7b63-service-ca" (OuterVolumeSpecName: "service-ca") pod "aa104b99-ce27-4823-9997-a0cac82f7b63" (UID: "aa104b99-ce27-4823-9997-a0cac82f7b63"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:45:29.920101 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:29.920069 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa104b99-ce27-4823-9997-a0cac82f7b63-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "aa104b99-ce27-4823-9997-a0cac82f7b63" (UID: "aa104b99-ce27-4823-9997-a0cac82f7b63"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:45:29.920101 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:29.920078 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa104b99-ce27-4823-9997-a0cac82f7b63-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "aa104b99-ce27-4823-9997-a0cac82f7b63" (UID: "aa104b99-ce27-4823-9997-a0cac82f7b63"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:45:29.920176 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:29.920165 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa104b99-ce27-4823-9997-a0cac82f7b63-console-config" (OuterVolumeSpecName: "console-config") pod "aa104b99-ce27-4823-9997-a0cac82f7b63" (UID: "aa104b99-ce27-4823-9997-a0cac82f7b63"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:45:29.921996 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:29.921957 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa104b99-ce27-4823-9997-a0cac82f7b63-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "aa104b99-ce27-4823-9997-a0cac82f7b63" (UID: "aa104b99-ce27-4823-9997-a0cac82f7b63"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:45:29.922121 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:29.922010 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa104b99-ce27-4823-9997-a0cac82f7b63-kube-api-access-78p5t" (OuterVolumeSpecName: "kube-api-access-78p5t") pod "aa104b99-ce27-4823-9997-a0cac82f7b63" (UID: "aa104b99-ce27-4823-9997-a0cac82f7b63"). InnerVolumeSpecName "kube-api-access-78p5t". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:45:29.922121 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:29.922015 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa104b99-ce27-4823-9997-a0cac82f7b63-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "aa104b99-ce27-4823-9997-a0cac82f7b63" (UID: "aa104b99-ce27-4823-9997-a0cac82f7b63"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:45:29.941959 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:29.941932 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5fdbf9894c-n795s_aa104b99-ce27-4823-9997-a0cac82f7b63/console/0.log"
Apr 22 18:45:29.942143 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:29.941976 2578 generic.go:358] "Generic (PLEG): container finished" podID="aa104b99-ce27-4823-9997-a0cac82f7b63" containerID="2e25116a387f5644c9fbf2ce90ef7d4df72efa9ba8a53fa44b0d19ea6b16e188" exitCode=2
Apr 22 18:45:29.942143 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:29.942030 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fdbf9894c-n795s" event={"ID":"aa104b99-ce27-4823-9997-a0cac82f7b63","Type":"ContainerDied","Data":"2e25116a387f5644c9fbf2ce90ef7d4df72efa9ba8a53fa44b0d19ea6b16e188"}
Apr 22 18:45:29.942143 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:29.942045 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5fdbf9894c-n795s"
Apr 22 18:45:29.942143 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:29.942063 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fdbf9894c-n795s" event={"ID":"aa104b99-ce27-4823-9997-a0cac82f7b63","Type":"ContainerDied","Data":"f30baf6088cb4dff7543877df37815e1c538472d18ac0d4dcd0000d3ec568e05"}
Apr 22 18:45:29.942143 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:29.942084 2578 scope.go:117] "RemoveContainer" containerID="2e25116a387f5644c9fbf2ce90ef7d4df72efa9ba8a53fa44b0d19ea6b16e188"
Apr 22 18:45:29.950268 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:29.950249 2578 scope.go:117] "RemoveContainer" containerID="2e25116a387f5644c9fbf2ce90ef7d4df72efa9ba8a53fa44b0d19ea6b16e188"
Apr 22 18:45:29.950544 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:45:29.950523 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e25116a387f5644c9fbf2ce90ef7d4df72efa9ba8a53fa44b0d19ea6b16e188\": container with ID starting with 2e25116a387f5644c9fbf2ce90ef7d4df72efa9ba8a53fa44b0d19ea6b16e188 not found: ID does not exist" containerID="2e25116a387f5644c9fbf2ce90ef7d4df72efa9ba8a53fa44b0d19ea6b16e188"
Apr 22 18:45:29.950595 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:29.950552 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e25116a387f5644c9fbf2ce90ef7d4df72efa9ba8a53fa44b0d19ea6b16e188"} err="failed to get container status \"2e25116a387f5644c9fbf2ce90ef7d4df72efa9ba8a53fa44b0d19ea6b16e188\": rpc error: code = NotFound desc = could not find container \"2e25116a387f5644c9fbf2ce90ef7d4df72efa9ba8a53fa44b0d19ea6b16e188\": container with ID starting with 2e25116a387f5644c9fbf2ce90ef7d4df72efa9ba8a53fa44b0d19ea6b16e188 not found: ID does not exist"
Apr 22 18:45:29.963602 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:29.963570 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5fdbf9894c-n795s"]
Apr 22 18:45:29.967132 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:29.967101 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5fdbf9894c-n795s"]
Apr 22 18:45:30.020668 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:30.020633 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa104b99-ce27-4823-9997-a0cac82f7b63-console-config\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:45:30.020668 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:30.020663 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa104b99-ce27-4823-9997-a0cac82f7b63-service-ca\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:45:30.020668 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:30.020672 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa104b99-ce27-4823-9997-a0cac82f7b63-console-oauth-config\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:45:30.020848 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:30.020682 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa104b99-ce27-4823-9997-a0cac82f7b63-oauth-serving-cert\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:45:30.020848 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:30.020692 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa104b99-ce27-4823-9997-a0cac82f7b63-trusted-ca-bundle\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:45:30.020848 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:30.020700 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-78p5t\" (UniqueName: \"kubernetes.io/projected/aa104b99-ce27-4823-9997-a0cac82f7b63-kube-api-access-78p5t\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:45:30.020848 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:30.020709 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa104b99-ce27-4823-9997-a0cac82f7b63-console-serving-cert\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:45:31.199110 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:31.199074 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-ddglq"]
Apr 22 18:45:31.199516 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:31.199376 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa104b99-ce27-4823-9997-a0cac82f7b63" containerName="console"
Apr 22 18:45:31.199516 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:31.199387 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa104b99-ce27-4823-9997-a0cac82f7b63" containerName="console"
Apr 22 18:45:31.199516 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:31.199440 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa104b99-ce27-4823-9997-a0cac82f7b63" containerName="console"
Apr 22 18:45:31.203586 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:31.203567 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ddglq"
Apr 22 18:45:31.206128 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:31.206102 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\""
Apr 22 18:45:31.206265 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:31.206149 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 22 18:45:31.207206 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:31.207190 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\""
Apr 22 18:45:31.207273 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:31.207192 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 22 18:45:31.207273 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:31.207219 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-rwc46\""
Apr 22 18:45:31.210768 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:31.210667 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-ddglq"]
Apr 22 18:45:31.328692 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:31.328651 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e94feca-ca43-412c-b1b4-23c6c48170b7-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-ddglq\" (UID: \"6e94feca-ca43-412c-b1b4-23c6c48170b7\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ddglq"
Apr 22 18:45:31.328869 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:31.328697 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfvtf\" (UniqueName: \"kubernetes.io/projected/6e94feca-ca43-412c-b1b4-23c6c48170b7-kube-api-access-xfvtf\") pod \"kuadrant-console-plugin-6c886788f8-ddglq\" (UID: \"6e94feca-ca43-412c-b1b4-23c6c48170b7\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ddglq"
Apr 22 18:45:31.328869 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:31.328748 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6e94feca-ca43-412c-b1b4-23c6c48170b7-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-ddglq\" (UID: \"6e94feca-ca43-412c-b1b4-23c6c48170b7\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ddglq"
Apr 22 18:45:31.430063 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:31.430025 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6e94feca-ca43-412c-b1b4-23c6c48170b7-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-ddglq\" (UID: \"6e94feca-ca43-412c-b1b4-23c6c48170b7\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ddglq"
Apr 22 18:45:31.430222 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:31.430106 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e94feca-ca43-412c-b1b4-23c6c48170b7-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-ddglq\" (UID: \"6e94feca-ca43-412c-b1b4-23c6c48170b7\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ddglq"
Apr 22 18:45:31.430222 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:31.430132 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xfvtf\" (UniqueName: \"kubernetes.io/projected/6e94feca-ca43-412c-b1b4-23c6c48170b7-kube-api-access-xfvtf\") pod \"kuadrant-console-plugin-6c886788f8-ddglq\" (UID: \"6e94feca-ca43-412c-b1b4-23c6c48170b7\")
" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ddglq" Apr 22 18:45:31.430294 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:45:31.430231 2578 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 22 18:45:31.430329 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:45:31.430303 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e94feca-ca43-412c-b1b4-23c6c48170b7-plugin-serving-cert podName:6e94feca-ca43-412c-b1b4-23c6c48170b7 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:31.93028018 +0000 UTC m=+460.920546793 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/6e94feca-ca43-412c-b1b4-23c6c48170b7-plugin-serving-cert") pod "kuadrant-console-plugin-6c886788f8-ddglq" (UID: "6e94feca-ca43-412c-b1b4-23c6c48170b7") : secret "plugin-serving-cert" not found Apr 22 18:45:31.430676 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:31.430658 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6e94feca-ca43-412c-b1b4-23c6c48170b7-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-ddglq\" (UID: \"6e94feca-ca43-412c-b1b4-23c6c48170b7\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ddglq" Apr 22 18:45:31.438794 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:31.438771 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfvtf\" (UniqueName: \"kubernetes.io/projected/6e94feca-ca43-412c-b1b4-23c6c48170b7-kube-api-access-xfvtf\") pod \"kuadrant-console-plugin-6c886788f8-ddglq\" (UID: \"6e94feca-ca43-412c-b1b4-23c6c48170b7\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ddglq" Apr 22 18:45:31.540387 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:31.540313 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="aa104b99-ce27-4823-9997-a0cac82f7b63" path="/var/lib/kubelet/pods/aa104b99-ce27-4823-9997-a0cac82f7b63/volumes" Apr 22 18:45:31.933568 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:31.933520 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e94feca-ca43-412c-b1b4-23c6c48170b7-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-ddglq\" (UID: \"6e94feca-ca43-412c-b1b4-23c6c48170b7\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ddglq" Apr 22 18:45:31.935966 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:31.935941 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e94feca-ca43-412c-b1b4-23c6c48170b7-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-ddglq\" (UID: \"6e94feca-ca43-412c-b1b4-23c6c48170b7\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ddglq" Apr 22 18:45:32.113243 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:32.113199 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ddglq" Apr 22 18:45:32.232603 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:32.232572 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-ddglq"] Apr 22 18:45:32.235222 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:45:32.235195 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e94feca_ca43_412c_b1b4_23c6c48170b7.slice/crio-c80b6bff217afcddc05fd771c7a39df7aa086b275f6f0ae8de58be459da4823f WatchSource:0}: Error finding container c80b6bff217afcddc05fd771c7a39df7aa086b275f6f0ae8de58be459da4823f: Status 404 returned error can't find the container with id c80b6bff217afcddc05fd771c7a39df7aa086b275f6f0ae8de58be459da4823f Apr 22 18:45:32.952910 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:32.952869 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ddglq" event={"ID":"6e94feca-ca43-412c-b1b4-23c6c48170b7","Type":"ContainerStarted","Data":"c80b6bff217afcddc05fd771c7a39df7aa086b275f6f0ae8de58be459da4823f"} Apr 22 18:45:36.968248 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:36.968204 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ddglq" event={"ID":"6e94feca-ca43-412c-b1b4-23c6c48170b7","Type":"ContainerStarted","Data":"951a3515c03d93d50f9417fc5ba9081501e2a995523d3e064526d652870804b3"} Apr 22 18:45:36.985680 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:45:36.985625 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ddglq" podStartSLOduration=1.370087058 podStartE2EDuration="5.985609057s" podCreationTimestamp="2026-04-22 18:45:31 +0000 UTC" firstStartedPulling="2026-04-22 18:45:32.236880863 +0000 UTC m=+461.227147469" 
lastFinishedPulling="2026-04-22 18:45:36.852402858 +0000 UTC m=+465.842669468" observedRunningTime="2026-04-22 18:45:36.984329586 +0000 UTC m=+465.974596227" watchObservedRunningTime="2026-04-22 18:45:36.985609057 +0000 UTC m=+465.975875686" Apr 22 18:46:16.435673 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:16.435635 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-zs7qk"] Apr 22 18:46:16.438844 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:16.438825 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-zs7qk" Apr 22 18:46:16.441311 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:16.441289 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 22 18:46:16.448479 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:16.448435 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-zs7qk"] Apr 22 18:46:16.486682 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:16.486643 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/fb662351-b41b-411f-826c-57c62d7be5cd-config-file\") pod \"limitador-limitador-64c8f475fb-zs7qk\" (UID: \"fb662351-b41b-411f-826c-57c62d7be5cd\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-zs7qk" Apr 22 18:46:16.486860 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:16.486699 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvdhz\" (UniqueName: \"kubernetes.io/projected/fb662351-b41b-411f-826c-57c62d7be5cd-kube-api-access-fvdhz\") pod \"limitador-limitador-64c8f475fb-zs7qk\" (UID: \"fb662351-b41b-411f-826c-57c62d7be5cd\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-zs7qk" Apr 22 
18:46:16.531356 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:16.531317 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-zs7qk"] Apr 22 18:46:16.587808 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:16.587766 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fvdhz\" (UniqueName: \"kubernetes.io/projected/fb662351-b41b-411f-826c-57c62d7be5cd-kube-api-access-fvdhz\") pod \"limitador-limitador-64c8f475fb-zs7qk\" (UID: \"fb662351-b41b-411f-826c-57c62d7be5cd\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-zs7qk" Apr 22 18:46:16.587994 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:16.587837 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/fb662351-b41b-411f-826c-57c62d7be5cd-config-file\") pod \"limitador-limitador-64c8f475fb-zs7qk\" (UID: \"fb662351-b41b-411f-826c-57c62d7be5cd\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-zs7qk" Apr 22 18:46:16.588390 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:16.588372 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/fb662351-b41b-411f-826c-57c62d7be5cd-config-file\") pod \"limitador-limitador-64c8f475fb-zs7qk\" (UID: \"fb662351-b41b-411f-826c-57c62d7be5cd\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-zs7qk" Apr 22 18:46:16.596686 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:16.596651 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvdhz\" (UniqueName: \"kubernetes.io/projected/fb662351-b41b-411f-826c-57c62d7be5cd-kube-api-access-fvdhz\") pod \"limitador-limitador-64c8f475fb-zs7qk\" (UID: \"fb662351-b41b-411f-826c-57c62d7be5cd\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-zs7qk" Apr 22 18:46:16.750364 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:16.750270 2578 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-zs7qk" Apr 22 18:46:16.875369 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:16.875330 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-zs7qk"] Apr 22 18:46:16.878708 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:46:16.878680 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb662351_b41b_411f_826c_57c62d7be5cd.slice/crio-a44a01a67ad021169778358e2458d8f5d5950cee15bf08cdece07f571e1a6347 WatchSource:0}: Error finding container a44a01a67ad021169778358e2458d8f5d5950cee15bf08cdece07f571e1a6347: Status 404 returned error can't find the container with id a44a01a67ad021169778358e2458d8f5d5950cee15bf08cdece07f571e1a6347 Apr 22 18:46:17.093792 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:17.093698 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-zs7qk" event={"ID":"fb662351-b41b-411f-826c-57c62d7be5cd","Type":"ContainerStarted","Data":"a44a01a67ad021169778358e2458d8f5d5950cee15bf08cdece07f571e1a6347"} Apr 22 18:46:17.257086 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:17.257044 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-674b59b84c-7hvjj"] Apr 22 18:46:17.262089 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:17.262060 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-7hvjj" Apr 22 18:46:17.264544 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:17.264500 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-gww7g\"" Apr 22 18:46:17.267227 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:17.267200 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-7hvjj"] Apr 22 18:46:17.295219 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:17.295181 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q58q\" (UniqueName: \"kubernetes.io/projected/15be469d-0635-4c57-bc72-d64c508c34d4-kube-api-access-8q58q\") pod \"authorino-674b59b84c-7hvjj\" (UID: \"15be469d-0635-4c57-bc72-d64c508c34d4\") " pod="kuadrant-system/authorino-674b59b84c-7hvjj" Apr 22 18:46:17.395770 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:17.395739 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8q58q\" (UniqueName: \"kubernetes.io/projected/15be469d-0635-4c57-bc72-d64c508c34d4-kube-api-access-8q58q\") pod \"authorino-674b59b84c-7hvjj\" (UID: \"15be469d-0635-4c57-bc72-d64c508c34d4\") " pod="kuadrant-system/authorino-674b59b84c-7hvjj" Apr 22 18:46:17.403922 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:17.403886 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q58q\" (UniqueName: \"kubernetes.io/projected/15be469d-0635-4c57-bc72-d64c508c34d4-kube-api-access-8q58q\") pod \"authorino-674b59b84c-7hvjj\" (UID: \"15be469d-0635-4c57-bc72-d64c508c34d4\") " pod="kuadrant-system/authorino-674b59b84c-7hvjj" Apr 22 18:46:17.457316 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:17.457284 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-79cbc94b89-6pps4"] Apr 22 18:46:17.460410 ip-10-0-132-151 
kubenswrapper[2578]: I0422 18:46:17.460395 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-6pps4" Apr 22 18:46:17.464678 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:17.464635 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-6pps4"] Apr 22 18:46:17.496140 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:17.496107 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq7wq\" (UniqueName: \"kubernetes.io/projected/c20afde0-9486-45e8-a264-7ffea4bd9c83-kube-api-access-tq7wq\") pod \"authorino-79cbc94b89-6pps4\" (UID: \"c20afde0-9486-45e8-a264-7ffea4bd9c83\") " pod="kuadrant-system/authorino-79cbc94b89-6pps4" Apr 22 18:46:17.574172 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:17.574129 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-7hvjj" Apr 22 18:46:17.597440 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:17.597401 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tq7wq\" (UniqueName: \"kubernetes.io/projected/c20afde0-9486-45e8-a264-7ffea4bd9c83-kube-api-access-tq7wq\") pod \"authorino-79cbc94b89-6pps4\" (UID: \"c20afde0-9486-45e8-a264-7ffea4bd9c83\") " pod="kuadrant-system/authorino-79cbc94b89-6pps4" Apr 22 18:46:17.606730 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:17.606688 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq7wq\" (UniqueName: \"kubernetes.io/projected/c20afde0-9486-45e8-a264-7ffea4bd9c83-kube-api-access-tq7wq\") pod \"authorino-79cbc94b89-6pps4\" (UID: \"c20afde0-9486-45e8-a264-7ffea4bd9c83\") " pod="kuadrant-system/authorino-79cbc94b89-6pps4" Apr 22 18:46:17.693680 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:17.693635 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/authorino-674b59b84c-7hvjj"] Apr 22 18:46:17.696377 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:46:17.696346 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15be469d_0635_4c57_bc72_d64c508c34d4.slice/crio-c3c19dd9e9285284e3371b20b40fc92b8faccb4673cc5358eebe2d38d2f092b2 WatchSource:0}: Error finding container c3c19dd9e9285284e3371b20b40fc92b8faccb4673cc5358eebe2d38d2f092b2: Status 404 returned error can't find the container with id c3c19dd9e9285284e3371b20b40fc92b8faccb4673cc5358eebe2d38d2f092b2 Apr 22 18:46:17.770851 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:17.770798 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-6pps4" Apr 22 18:46:17.893028 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:17.893004 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-6pps4"] Apr 22 18:46:17.895301 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:46:17.895273 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc20afde0_9486_45e8_a264_7ffea4bd9c83.slice/crio-c52abead9e51ee9672987f5a6ca2771a7a6192b1831262fcf184485bd098af01 WatchSource:0}: Error finding container c52abead9e51ee9672987f5a6ca2771a7a6192b1831262fcf184485bd098af01: Status 404 returned error can't find the container with id c52abead9e51ee9672987f5a6ca2771a7a6192b1831262fcf184485bd098af01 Apr 22 18:46:18.098710 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:18.098616 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-6pps4" event={"ID":"c20afde0-9486-45e8-a264-7ffea4bd9c83","Type":"ContainerStarted","Data":"c52abead9e51ee9672987f5a6ca2771a7a6192b1831262fcf184485bd098af01"} Apr 22 18:46:18.101064 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:18.101009 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-7hvjj" event={"ID":"15be469d-0635-4c57-bc72-d64c508c34d4","Type":"ContainerStarted","Data":"c3c19dd9e9285284e3371b20b40fc92b8faccb4673cc5358eebe2d38d2f092b2"} Apr 22 18:46:19.108448 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:19.107576 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-zs7qk" event={"ID":"fb662351-b41b-411f-826c-57c62d7be5cd","Type":"ContainerStarted","Data":"7e3bb61747556dc8cd374336c19408dd25664b82bb39a397f21ecaee277d7f7a"} Apr 22 18:46:19.108448 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:19.108396 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-zs7qk" Apr 22 18:46:19.127978 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:19.127051 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-64c8f475fb-zs7qk" podStartSLOduration=1.49510847 podStartE2EDuration="3.12703002s" podCreationTimestamp="2026-04-22 18:46:16 +0000 UTC" firstStartedPulling="2026-04-22 18:46:16.880438961 +0000 UTC m=+505.870705568" lastFinishedPulling="2026-04-22 18:46:18.512360511 +0000 UTC m=+507.502627118" observedRunningTime="2026-04-22 18:46:19.124991485 +0000 UTC m=+508.115258115" watchObservedRunningTime="2026-04-22 18:46:19.12703002 +0000 UTC m=+508.117296650" Apr 22 18:46:21.115612 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:21.115520 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-6pps4" event={"ID":"c20afde0-9486-45e8-a264-7ffea4bd9c83","Type":"ContainerStarted","Data":"b76ec135d487f5c604796ce9e0a6c0f6a7b15ea55d066c32af42502f84f32ace"} Apr 22 18:46:21.116875 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:21.116850 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-7hvjj" 
event={"ID":"15be469d-0635-4c57-bc72-d64c508c34d4","Type":"ContainerStarted","Data":"2be4af7b61f6c826d2611c57a57ce8d919f9f94d4f651f6b0b151b7f46ed76e7"} Apr 22 18:46:21.130167 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:21.130121 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-79cbc94b89-6pps4" podStartSLOduration=1.166997418 podStartE2EDuration="4.130105833s" podCreationTimestamp="2026-04-22 18:46:17 +0000 UTC" firstStartedPulling="2026-04-22 18:46:17.896567913 +0000 UTC m=+506.886834524" lastFinishedPulling="2026-04-22 18:46:20.859676333 +0000 UTC m=+509.849942939" observedRunningTime="2026-04-22 18:46:21.129722284 +0000 UTC m=+510.119988915" watchObservedRunningTime="2026-04-22 18:46:21.130105833 +0000 UTC m=+510.120372462" Apr 22 18:46:21.144171 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:21.144115 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-674b59b84c-7hvjj" podStartSLOduration=0.972010116 podStartE2EDuration="4.144098925s" podCreationTimestamp="2026-04-22 18:46:17 +0000 UTC" firstStartedPulling="2026-04-22 18:46:17.697780922 +0000 UTC m=+506.688047538" lastFinishedPulling="2026-04-22 18:46:20.86986973 +0000 UTC m=+509.860136347" observedRunningTime="2026-04-22 18:46:21.143336955 +0000 UTC m=+510.133603585" watchObservedRunningTime="2026-04-22 18:46:21.144098925 +0000 UTC m=+510.134365554" Apr 22 18:46:21.165887 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:21.165849 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-7hvjj"] Apr 22 18:46:23.122954 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:23.122879 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-674b59b84c-7hvjj" podUID="15be469d-0635-4c57-bc72-d64c508c34d4" containerName="authorino" containerID="cri-o://2be4af7b61f6c826d2611c57a57ce8d919f9f94d4f651f6b0b151b7f46ed76e7" 
gracePeriod=30 Apr 22 18:46:23.366350 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:23.366325 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-7hvjj" Apr 22 18:46:23.450722 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:23.450691 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8q58q\" (UniqueName: \"kubernetes.io/projected/15be469d-0635-4c57-bc72-d64c508c34d4-kube-api-access-8q58q\") pod \"15be469d-0635-4c57-bc72-d64c508c34d4\" (UID: \"15be469d-0635-4c57-bc72-d64c508c34d4\") " Apr 22 18:46:23.452933 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:23.452902 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15be469d-0635-4c57-bc72-d64c508c34d4-kube-api-access-8q58q" (OuterVolumeSpecName: "kube-api-access-8q58q") pod "15be469d-0635-4c57-bc72-d64c508c34d4" (UID: "15be469d-0635-4c57-bc72-d64c508c34d4"). InnerVolumeSpecName "kube-api-access-8q58q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:46:23.551834 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:23.551805 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8q58q\" (UniqueName: \"kubernetes.io/projected/15be469d-0635-4c57-bc72-d64c508c34d4-kube-api-access-8q58q\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:46:24.126630 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:24.126586 2578 generic.go:358] "Generic (PLEG): container finished" podID="15be469d-0635-4c57-bc72-d64c508c34d4" containerID="2be4af7b61f6c826d2611c57a57ce8d919f9f94d4f651f6b0b151b7f46ed76e7" exitCode=0 Apr 22 18:46:24.127119 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:24.126640 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-7hvjj" Apr 22 18:46:24.127119 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:24.126681 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-7hvjj" event={"ID":"15be469d-0635-4c57-bc72-d64c508c34d4","Type":"ContainerDied","Data":"2be4af7b61f6c826d2611c57a57ce8d919f9f94d4f651f6b0b151b7f46ed76e7"} Apr 22 18:46:24.127119 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:24.126723 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-7hvjj" event={"ID":"15be469d-0635-4c57-bc72-d64c508c34d4","Type":"ContainerDied","Data":"c3c19dd9e9285284e3371b20b40fc92b8faccb4673cc5358eebe2d38d2f092b2"} Apr 22 18:46:24.127119 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:24.126745 2578 scope.go:117] "RemoveContainer" containerID="2be4af7b61f6c826d2611c57a57ce8d919f9f94d4f651f6b0b151b7f46ed76e7" Apr 22 18:46:24.134508 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:24.134478 2578 scope.go:117] "RemoveContainer" containerID="2be4af7b61f6c826d2611c57a57ce8d919f9f94d4f651f6b0b151b7f46ed76e7" Apr 22 18:46:24.134840 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:46:24.134820 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2be4af7b61f6c826d2611c57a57ce8d919f9f94d4f651f6b0b151b7f46ed76e7\": container with ID starting with 2be4af7b61f6c826d2611c57a57ce8d919f9f94d4f651f6b0b151b7f46ed76e7 not found: ID does not exist" containerID="2be4af7b61f6c826d2611c57a57ce8d919f9f94d4f651f6b0b151b7f46ed76e7" Apr 22 18:46:24.134890 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:24.134851 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2be4af7b61f6c826d2611c57a57ce8d919f9f94d4f651f6b0b151b7f46ed76e7"} err="failed to get container status \"2be4af7b61f6c826d2611c57a57ce8d919f9f94d4f651f6b0b151b7f46ed76e7\": rpc error: code = 
NotFound desc = could not find container \"2be4af7b61f6c826d2611c57a57ce8d919f9f94d4f651f6b0b151b7f46ed76e7\": container with ID starting with 2be4af7b61f6c826d2611c57a57ce8d919f9f94d4f651f6b0b151b7f46ed76e7 not found: ID does not exist" Apr 22 18:46:24.142870 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:24.142839 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-7hvjj"] Apr 22 18:46:24.146813 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:24.146786 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-674b59b84c-7hvjj"] Apr 22 18:46:25.539877 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:25.539841 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15be469d-0635-4c57-bc72-d64c508c34d4" path="/var/lib/kubelet/pods/15be469d-0635-4c57-bc72-d64c508c34d4/volumes" Apr 22 18:46:31.118260 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:31.118227 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-zs7qk" Apr 22 18:46:32.547450 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:32.547414 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-zs7qk"] Apr 22 18:46:32.547890 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:32.547671 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-64c8f475fb-zs7qk" podUID="fb662351-b41b-411f-826c-57c62d7be5cd" containerName="limitador" containerID="cri-o://7e3bb61747556dc8cd374336c19408dd25664b82bb39a397f21ecaee277d7f7a" gracePeriod=30 Apr 22 18:46:33.093616 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:33.093585 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-zs7qk" Apr 22 18:46:33.156989 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:33.156957 2578 generic.go:358] "Generic (PLEG): container finished" podID="fb662351-b41b-411f-826c-57c62d7be5cd" containerID="7e3bb61747556dc8cd374336c19408dd25664b82bb39a397f21ecaee277d7f7a" exitCode=0 Apr 22 18:46:33.157173 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:33.156998 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-zs7qk" event={"ID":"fb662351-b41b-411f-826c-57c62d7be5cd","Type":"ContainerDied","Data":"7e3bb61747556dc8cd374336c19408dd25664b82bb39a397f21ecaee277d7f7a"} Apr 22 18:46:33.157173 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:33.157019 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-zs7qk" Apr 22 18:46:33.157173 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:33.157032 2578 scope.go:117] "RemoveContainer" containerID="7e3bb61747556dc8cd374336c19408dd25664b82bb39a397f21ecaee277d7f7a" Apr 22 18:46:33.157173 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:33.157022 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-zs7qk" event={"ID":"fb662351-b41b-411f-826c-57c62d7be5cd","Type":"ContainerDied","Data":"a44a01a67ad021169778358e2458d8f5d5950cee15bf08cdece07f571e1a6347"} Apr 22 18:46:33.164644 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:33.164621 2578 scope.go:117] "RemoveContainer" containerID="7e3bb61747556dc8cd374336c19408dd25664b82bb39a397f21ecaee277d7f7a" Apr 22 18:46:33.164913 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:46:33.164893 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e3bb61747556dc8cd374336c19408dd25664b82bb39a397f21ecaee277d7f7a\": container with ID starting with 
7e3bb61747556dc8cd374336c19408dd25664b82bb39a397f21ecaee277d7f7a not found: ID does not exist" containerID="7e3bb61747556dc8cd374336c19408dd25664b82bb39a397f21ecaee277d7f7a" Apr 22 18:46:33.164982 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:33.164923 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e3bb61747556dc8cd374336c19408dd25664b82bb39a397f21ecaee277d7f7a"} err="failed to get container status \"7e3bb61747556dc8cd374336c19408dd25664b82bb39a397f21ecaee277d7f7a\": rpc error: code = NotFound desc = could not find container \"7e3bb61747556dc8cd374336c19408dd25664b82bb39a397f21ecaee277d7f7a\": container with ID starting with 7e3bb61747556dc8cd374336c19408dd25664b82bb39a397f21ecaee277d7f7a not found: ID does not exist" Apr 22 18:46:33.224597 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:33.224561 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/fb662351-b41b-411f-826c-57c62d7be5cd-config-file\") pod \"fb662351-b41b-411f-826c-57c62d7be5cd\" (UID: \"fb662351-b41b-411f-826c-57c62d7be5cd\") " Apr 22 18:46:33.224760 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:33.224612 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvdhz\" (UniqueName: \"kubernetes.io/projected/fb662351-b41b-411f-826c-57c62d7be5cd-kube-api-access-fvdhz\") pod \"fb662351-b41b-411f-826c-57c62d7be5cd\" (UID: \"fb662351-b41b-411f-826c-57c62d7be5cd\") " Apr 22 18:46:33.224993 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:33.224959 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb662351-b41b-411f-826c-57c62d7be5cd-config-file" (OuterVolumeSpecName: "config-file") pod "fb662351-b41b-411f-826c-57c62d7be5cd" (UID: "fb662351-b41b-411f-826c-57c62d7be5cd"). InnerVolumeSpecName "config-file". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:46:33.226770 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:33.226747 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb662351-b41b-411f-826c-57c62d7be5cd-kube-api-access-fvdhz" (OuterVolumeSpecName: "kube-api-access-fvdhz") pod "fb662351-b41b-411f-826c-57c62d7be5cd" (UID: "fb662351-b41b-411f-826c-57c62d7be5cd"). InnerVolumeSpecName "kube-api-access-fvdhz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:46:33.325309 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:33.325271 2578 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/fb662351-b41b-411f-826c-57c62d7be5cd-config-file\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:46:33.325309 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:33.325305 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fvdhz\" (UniqueName: \"kubernetes.io/projected/fb662351-b41b-411f-826c-57c62d7be5cd-kube-api-access-fvdhz\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:46:33.478991 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:33.478954 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-zs7qk"] Apr 22 18:46:33.483105 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:33.483069 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-zs7qk"] Apr 22 18:46:33.539431 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:33.539386 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb662351-b41b-411f-826c-57c62d7be5cd" path="/var/lib/kubelet/pods/fb662351-b41b-411f-826c-57c62d7be5cd/volumes" Apr 22 18:46:41.238324 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:41.238241 2578 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kuadrant-system/authorino-68bd676465-sqm4w"] Apr 22 18:46:41.238704 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:41.238560 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15be469d-0635-4c57-bc72-d64c508c34d4" containerName="authorino" Apr 22 18:46:41.238704 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:41.238571 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="15be469d-0635-4c57-bc72-d64c508c34d4" containerName="authorino" Apr 22 18:46:41.238704 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:41.238590 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb662351-b41b-411f-826c-57c62d7be5cd" containerName="limitador" Apr 22 18:46:41.238704 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:41.238595 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb662351-b41b-411f-826c-57c62d7be5cd" containerName="limitador" Apr 22 18:46:41.238704 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:41.238646 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="15be469d-0635-4c57-bc72-d64c508c34d4" containerName="authorino" Apr 22 18:46:41.238704 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:41.238654 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb662351-b41b-411f-826c-57c62d7be5cd" containerName="limitador" Apr 22 18:46:41.241469 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:41.241437 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68bd676465-sqm4w" Apr 22 18:46:41.243701 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:41.243681 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 22 18:46:41.247755 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:41.247728 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-sqm4w"] Apr 22 18:46:41.392314 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:41.392273 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw98h\" (UniqueName: \"kubernetes.io/projected/425c87f5-c0cb-48be-9762-4f5d43de58f6-kube-api-access-pw98h\") pod \"authorino-68bd676465-sqm4w\" (UID: \"425c87f5-c0cb-48be-9762-4f5d43de58f6\") " pod="kuadrant-system/authorino-68bd676465-sqm4w" Apr 22 18:46:41.392539 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:41.392321 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/425c87f5-c0cb-48be-9762-4f5d43de58f6-tls-cert\") pod \"authorino-68bd676465-sqm4w\" (UID: \"425c87f5-c0cb-48be-9762-4f5d43de58f6\") " pod="kuadrant-system/authorino-68bd676465-sqm4w" Apr 22 18:46:41.493733 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:41.493639 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pw98h\" (UniqueName: \"kubernetes.io/projected/425c87f5-c0cb-48be-9762-4f5d43de58f6-kube-api-access-pw98h\") pod \"authorino-68bd676465-sqm4w\" (UID: \"425c87f5-c0cb-48be-9762-4f5d43de58f6\") " pod="kuadrant-system/authorino-68bd676465-sqm4w" Apr 22 18:46:41.493733 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:41.493686 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: 
\"kubernetes.io/secret/425c87f5-c0cb-48be-9762-4f5d43de58f6-tls-cert\") pod \"authorino-68bd676465-sqm4w\" (UID: \"425c87f5-c0cb-48be-9762-4f5d43de58f6\") " pod="kuadrant-system/authorino-68bd676465-sqm4w" Apr 22 18:46:41.496127 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:41.496094 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/425c87f5-c0cb-48be-9762-4f5d43de58f6-tls-cert\") pod \"authorino-68bd676465-sqm4w\" (UID: \"425c87f5-c0cb-48be-9762-4f5d43de58f6\") " pod="kuadrant-system/authorino-68bd676465-sqm4w" Apr 22 18:46:41.503277 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:41.503252 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw98h\" (UniqueName: \"kubernetes.io/projected/425c87f5-c0cb-48be-9762-4f5d43de58f6-kube-api-access-pw98h\") pod \"authorino-68bd676465-sqm4w\" (UID: \"425c87f5-c0cb-48be-9762-4f5d43de58f6\") " pod="kuadrant-system/authorino-68bd676465-sqm4w" Apr 22 18:46:41.550880 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:41.550835 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68bd676465-sqm4w" Apr 22 18:46:41.678727 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:41.678582 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-sqm4w"] Apr 22 18:46:41.681322 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:46:41.681287 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod425c87f5_c0cb_48be_9762_4f5d43de58f6.slice/crio-295ddc3ea21e4ee50eefa4197703113183335afd448575a0554b2c78c6a29ee6 WatchSource:0}: Error finding container 295ddc3ea21e4ee50eefa4197703113183335afd448575a0554b2c78c6a29ee6: Status 404 returned error can't find the container with id 295ddc3ea21e4ee50eefa4197703113183335afd448575a0554b2c78c6a29ee6 Apr 22 18:46:42.190509 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:42.190447 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-sqm4w" event={"ID":"425c87f5-c0cb-48be-9762-4f5d43de58f6","Type":"ContainerStarted","Data":"295ddc3ea21e4ee50eefa4197703113183335afd448575a0554b2c78c6a29ee6"} Apr 22 18:46:43.194809 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:43.194766 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-sqm4w" event={"ID":"425c87f5-c0cb-48be-9762-4f5d43de58f6","Type":"ContainerStarted","Data":"3f902eb4711ec45acdaae1101c8bf7d298a012deddc74e2455d9d1db2d089b25"} Apr 22 18:46:43.211070 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:43.211014 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-68bd676465-sqm4w" podStartSLOduration=1.694052091 podStartE2EDuration="2.21099821s" podCreationTimestamp="2026-04-22 18:46:41 +0000 UTC" firstStartedPulling="2026-04-22 18:46:41.682779445 +0000 UTC m=+530.673046053" lastFinishedPulling="2026-04-22 18:46:42.199725562 +0000 UTC m=+531.189992172" 
observedRunningTime="2026-04-22 18:46:43.209884784 +0000 UTC m=+532.200151404" watchObservedRunningTime="2026-04-22 18:46:43.21099821 +0000 UTC m=+532.201264839" Apr 22 18:46:43.238819 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:43.238780 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-6pps4"] Apr 22 18:46:43.239078 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:43.239050 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-79cbc94b89-6pps4" podUID="c20afde0-9486-45e8-a264-7ffea4bd9c83" containerName="authorino" containerID="cri-o://b76ec135d487f5c604796ce9e0a6c0f6a7b15ea55d066c32af42502f84f32ace" gracePeriod=30 Apr 22 18:46:43.474599 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:43.474574 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-6pps4" Apr 22 18:46:43.611907 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:43.611866 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tq7wq\" (UniqueName: \"kubernetes.io/projected/c20afde0-9486-45e8-a264-7ffea4bd9c83-kube-api-access-tq7wq\") pod \"c20afde0-9486-45e8-a264-7ffea4bd9c83\" (UID: \"c20afde0-9486-45e8-a264-7ffea4bd9c83\") " Apr 22 18:46:43.614023 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:43.613980 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c20afde0-9486-45e8-a264-7ffea4bd9c83-kube-api-access-tq7wq" (OuterVolumeSpecName: "kube-api-access-tq7wq") pod "c20afde0-9486-45e8-a264-7ffea4bd9c83" (UID: "c20afde0-9486-45e8-a264-7ffea4bd9c83"). InnerVolumeSpecName "kube-api-access-tq7wq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:46:43.713255 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:43.713141 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tq7wq\" (UniqueName: \"kubernetes.io/projected/c20afde0-9486-45e8-a264-7ffea4bd9c83-kube-api-access-tq7wq\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:46:44.199403 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:44.199365 2578 generic.go:358] "Generic (PLEG): container finished" podID="c20afde0-9486-45e8-a264-7ffea4bd9c83" containerID="b76ec135d487f5c604796ce9e0a6c0f6a7b15ea55d066c32af42502f84f32ace" exitCode=0 Apr 22 18:46:44.199856 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:44.199419 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-6pps4" Apr 22 18:46:44.199856 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:44.199455 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-6pps4" event={"ID":"c20afde0-9486-45e8-a264-7ffea4bd9c83","Type":"ContainerDied","Data":"b76ec135d487f5c604796ce9e0a6c0f6a7b15ea55d066c32af42502f84f32ace"} Apr 22 18:46:44.199856 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:44.199515 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-6pps4" event={"ID":"c20afde0-9486-45e8-a264-7ffea4bd9c83","Type":"ContainerDied","Data":"c52abead9e51ee9672987f5a6ca2771a7a6192b1831262fcf184485bd098af01"} Apr 22 18:46:44.199856 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:44.199533 2578 scope.go:117] "RemoveContainer" containerID="b76ec135d487f5c604796ce9e0a6c0f6a7b15ea55d066c32af42502f84f32ace" Apr 22 18:46:44.207435 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:44.207414 2578 scope.go:117] "RemoveContainer" containerID="b76ec135d487f5c604796ce9e0a6c0f6a7b15ea55d066c32af42502f84f32ace" Apr 22 18:46:44.207748 ip-10-0-132-151 kubenswrapper[2578]: 
E0422 18:46:44.207728 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b76ec135d487f5c604796ce9e0a6c0f6a7b15ea55d066c32af42502f84f32ace\": container with ID starting with b76ec135d487f5c604796ce9e0a6c0f6a7b15ea55d066c32af42502f84f32ace not found: ID does not exist" containerID="b76ec135d487f5c604796ce9e0a6c0f6a7b15ea55d066c32af42502f84f32ace" Apr 22 18:46:44.207790 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:44.207759 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b76ec135d487f5c604796ce9e0a6c0f6a7b15ea55d066c32af42502f84f32ace"} err="failed to get container status \"b76ec135d487f5c604796ce9e0a6c0f6a7b15ea55d066c32af42502f84f32ace\": rpc error: code = NotFound desc = could not find container \"b76ec135d487f5c604796ce9e0a6c0f6a7b15ea55d066c32af42502f84f32ace\": container with ID starting with b76ec135d487f5c604796ce9e0a6c0f6a7b15ea55d066c32af42502f84f32ace not found: ID does not exist" Apr 22 18:46:44.220589 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:44.220554 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-6pps4"] Apr 22 18:46:44.227309 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:44.227275 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-6pps4"] Apr 22 18:46:45.539547 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:46:45.539514 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c20afde0-9486-45e8-a264-7ffea4bd9c83" path="/var/lib/kubelet/pods/c20afde0-9486-45e8-a264-7ffea4bd9c83/volumes" Apr 22 18:47:00.002395 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:47:00.002358 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-dc64b4c84-fsldz"] Apr 22 18:47:00.002860 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:47:00.002807 2578 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="c20afde0-9486-45e8-a264-7ffea4bd9c83" containerName="authorino" Apr 22 18:47:00.002860 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:47:00.002823 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c20afde0-9486-45e8-a264-7ffea4bd9c83" containerName="authorino" Apr 22 18:47:00.002932 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:47:00.002878 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c20afde0-9486-45e8-a264-7ffea4bd9c83" containerName="authorino" Apr 22 18:47:00.006616 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:47:00.006592 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-dc64b4c84-fsldz" Apr 22 18:47:00.009007 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:47:00.008985 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 18:47:00.009212 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:47:00.009192 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 18:47:00.010032 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:47:00.010005 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-dt769\"" Apr 22 18:47:00.010129 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:47:00.010014 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 22 18:47:00.014494 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:47:00.014453 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-dc64b4c84-fsldz"] Apr 22 18:47:00.151911 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:47:00.151859 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxq6t\" (UniqueName: 
\"kubernetes.io/projected/d4cbe54f-548a-471a-b49b-9a8f2f8a25c3-kube-api-access-sxq6t\") pod \"llmisvc-controller-manager-dc64b4c84-fsldz\" (UID: \"d4cbe54f-548a-471a-b49b-9a8f2f8a25c3\") " pod="kserve/llmisvc-controller-manager-dc64b4c84-fsldz" Apr 22 18:47:00.152090 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:47:00.151985 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d4cbe54f-548a-471a-b49b-9a8f2f8a25c3-cert\") pod \"llmisvc-controller-manager-dc64b4c84-fsldz\" (UID: \"d4cbe54f-548a-471a-b49b-9a8f2f8a25c3\") " pod="kserve/llmisvc-controller-manager-dc64b4c84-fsldz" Apr 22 18:47:00.253225 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:47:00.253136 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d4cbe54f-548a-471a-b49b-9a8f2f8a25c3-cert\") pod \"llmisvc-controller-manager-dc64b4c84-fsldz\" (UID: \"d4cbe54f-548a-471a-b49b-9a8f2f8a25c3\") " pod="kserve/llmisvc-controller-manager-dc64b4c84-fsldz" Apr 22 18:47:00.253225 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:47:00.253175 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sxq6t\" (UniqueName: \"kubernetes.io/projected/d4cbe54f-548a-471a-b49b-9a8f2f8a25c3-kube-api-access-sxq6t\") pod \"llmisvc-controller-manager-dc64b4c84-fsldz\" (UID: \"d4cbe54f-548a-471a-b49b-9a8f2f8a25c3\") " pod="kserve/llmisvc-controller-manager-dc64b4c84-fsldz" Apr 22 18:47:00.255602 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:47:00.255581 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d4cbe54f-548a-471a-b49b-9a8f2f8a25c3-cert\") pod \"llmisvc-controller-manager-dc64b4c84-fsldz\" (UID: \"d4cbe54f-548a-471a-b49b-9a8f2f8a25c3\") " pod="kserve/llmisvc-controller-manager-dc64b4c84-fsldz" Apr 22 18:47:00.263416 ip-10-0-132-151 kubenswrapper[2578]: I0422 
18:47:00.263392 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxq6t\" (UniqueName: \"kubernetes.io/projected/d4cbe54f-548a-471a-b49b-9a8f2f8a25c3-kube-api-access-sxq6t\") pod \"llmisvc-controller-manager-dc64b4c84-fsldz\" (UID: \"d4cbe54f-548a-471a-b49b-9a8f2f8a25c3\") " pod="kserve/llmisvc-controller-manager-dc64b4c84-fsldz" Apr 22 18:47:00.317980 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:47:00.317945 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-dc64b4c84-fsldz" Apr 22 18:47:00.445537 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:47:00.445500 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-dc64b4c84-fsldz"] Apr 22 18:47:00.448790 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:47:00.448749 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd4cbe54f_548a_471a_b49b_9a8f2f8a25c3.slice/crio-7de378c261e2c73b9ece649262c34154e1c2294090aaff47bf5ed2ded2ee9c6e WatchSource:0}: Error finding container 7de378c261e2c73b9ece649262c34154e1c2294090aaff47bf5ed2ded2ee9c6e: Status 404 returned error can't find the container with id 7de378c261e2c73b9ece649262c34154e1c2294090aaff47bf5ed2ded2ee9c6e Apr 22 18:47:01.254406 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:47:01.254369 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-dc64b4c84-fsldz" event={"ID":"d4cbe54f-548a-471a-b49b-9a8f2f8a25c3","Type":"ContainerStarted","Data":"7de378c261e2c73b9ece649262c34154e1c2294090aaff47bf5ed2ded2ee9c6e"} Apr 22 18:47:04.264996 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:47:04.264949 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-dc64b4c84-fsldz" event={"ID":"d4cbe54f-548a-471a-b49b-9a8f2f8a25c3","Type":"ContainerStarted","Data":"348fef1ea5665d78d6f4c9f100a797ed3391a6f4f7f585041cee605a66fd1f09"} Apr 
22 18:47:04.265403 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:47:04.265081 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-dc64b4c84-fsldz" Apr 22 18:47:04.283054 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:47:04.282998 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-dc64b4c84-fsldz" podStartSLOduration=1.698191768 podStartE2EDuration="5.282956915s" podCreationTimestamp="2026-04-22 18:46:59 +0000 UTC" firstStartedPulling="2026-04-22 18:47:00.450220972 +0000 UTC m=+549.440487593" lastFinishedPulling="2026-04-22 18:47:04.03498613 +0000 UTC m=+553.025252740" observedRunningTime="2026-04-22 18:47:04.282292048 +0000 UTC m=+553.272558678" watchObservedRunningTime="2026-04-22 18:47:04.282956915 +0000 UTC m=+553.273223544" Apr 22 18:47:35.271553 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:47:35.271515 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-dc64b4c84-fsldz" Apr 22 18:47:51.447781 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:47:51.447755 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2rx9_14a4227e-0dba-42ae-a250-c23eb012b836/ovn-acl-logging/0.log" Apr 22 18:47:51.449087 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:47:51.449060 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2rx9_14a4227e-0dba-42ae-a250-c23eb012b836/ovn-acl-logging/0.log" Apr 22 18:48:10.097775 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:10.097693 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-hj6k6"] Apr 22 18:48:10.101169 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:10.101148 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-hj6k6" Apr 22 18:48:10.103730 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:10.103704 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 22 18:48:10.103883 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:10.103769 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-hndn2\"" Apr 22 18:48:10.111324 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:10.111294 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-hj6k6"] Apr 22 18:48:10.187811 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:10.187773 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28f6h\" (UniqueName: \"kubernetes.io/projected/1a7017da-d849-49c5-a9f6-fe9997ffa457-kube-api-access-28f6h\") pod \"model-serving-api-86f7b4b499-hj6k6\" (UID: \"1a7017da-d849-49c5-a9f6-fe9997ffa457\") " pod="kserve/model-serving-api-86f7b4b499-hj6k6" Apr 22 18:48:10.187978 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:10.187826 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1a7017da-d849-49c5-a9f6-fe9997ffa457-tls-certs\") pod \"model-serving-api-86f7b4b499-hj6k6\" (UID: \"1a7017da-d849-49c5-a9f6-fe9997ffa457\") " pod="kserve/model-serving-api-86f7b4b499-hj6k6" Apr 22 18:48:10.289121 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:10.289085 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-28f6h\" (UniqueName: \"kubernetes.io/projected/1a7017da-d849-49c5-a9f6-fe9997ffa457-kube-api-access-28f6h\") pod \"model-serving-api-86f7b4b499-hj6k6\" (UID: \"1a7017da-d849-49c5-a9f6-fe9997ffa457\") " pod="kserve/model-serving-api-86f7b4b499-hj6k6" Apr 22 18:48:10.289322 
ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:10.289140 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1a7017da-d849-49c5-a9f6-fe9997ffa457-tls-certs\") pod \"model-serving-api-86f7b4b499-hj6k6\" (UID: \"1a7017da-d849-49c5-a9f6-fe9997ffa457\") " pod="kserve/model-serving-api-86f7b4b499-hj6k6" Apr 22 18:48:10.291676 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:10.291651 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1a7017da-d849-49c5-a9f6-fe9997ffa457-tls-certs\") pod \"model-serving-api-86f7b4b499-hj6k6\" (UID: \"1a7017da-d849-49c5-a9f6-fe9997ffa457\") " pod="kserve/model-serving-api-86f7b4b499-hj6k6" Apr 22 18:48:10.297694 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:10.297666 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-28f6h\" (UniqueName: \"kubernetes.io/projected/1a7017da-d849-49c5-a9f6-fe9997ffa457-kube-api-access-28f6h\") pod \"model-serving-api-86f7b4b499-hj6k6\" (UID: \"1a7017da-d849-49c5-a9f6-fe9997ffa457\") " pod="kserve/model-serving-api-86f7b4b499-hj6k6" Apr 22 18:48:10.411740 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:10.411701 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-hj6k6" Apr 22 18:48:10.535290 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:10.535106 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-hj6k6"] Apr 22 18:48:10.540727 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:48:10.540694 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a7017da_d849_49c5_a9f6_fe9997ffa457.slice/crio-3524875d43f70e48b5aca9bbc0b0cb4dc4bb998b655a728b9b0b721e1aefbfb7 WatchSource:0}: Error finding container 3524875d43f70e48b5aca9bbc0b0cb4dc4bb998b655a728b9b0b721e1aefbfb7: Status 404 returned error can't find the container with id 3524875d43f70e48b5aca9bbc0b0cb4dc4bb998b655a728b9b0b721e1aefbfb7 Apr 22 18:48:11.483107 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:11.483066 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-hj6k6" event={"ID":"1a7017da-d849-49c5-a9f6-fe9997ffa457","Type":"ContainerStarted","Data":"3524875d43f70e48b5aca9bbc0b0cb4dc4bb998b655a728b9b0b721e1aefbfb7"} Apr 22 18:48:13.492272 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:13.492233 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-hj6k6" event={"ID":"1a7017da-d849-49c5-a9f6-fe9997ffa457","Type":"ContainerStarted","Data":"f33860b5e537fae75acd0ec869b9362ebb2fd4bcfb121e0d7dd94621c6182891"} Apr 22 18:48:13.492704 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:13.492351 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-hj6k6" Apr 22 18:48:13.510031 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:13.509975 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-hj6k6" podStartSLOduration=1.181894996 podStartE2EDuration="3.509960414s" 
podCreationTimestamp="2026-04-22 18:48:10 +0000 UTC" firstStartedPulling="2026-04-22 18:48:10.542204401 +0000 UTC m=+619.532471009" lastFinishedPulling="2026-04-22 18:48:12.87026982 +0000 UTC m=+621.860536427" observedRunningTime="2026-04-22 18:48:13.508857777 +0000 UTC m=+622.499124441" watchObservedRunningTime="2026-04-22 18:48:13.509960414 +0000 UTC m=+622.500227039" Apr 22 18:48:24.501948 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:24.501913 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-hj6k6" Apr 22 18:48:26.171504 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:26.171455 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-t972h"] Apr 22 18:48:26.177655 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:26.177635 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-t972h" Apr 22 18:48:26.180427 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:26.180198 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 22 18:48:26.181094 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:26.181068 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-p9zql\"" Apr 22 18:48:26.182203 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:26.182181 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-t972h"] Apr 22 18:48:26.318624 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:26.318580 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbjmv\" (UniqueName: \"kubernetes.io/projected/f57a8eb1-cc41-45bb-8108-8ddf1a2d5229-kube-api-access-mbjmv\") pod \"s3-init-t972h\" (UID: \"f57a8eb1-cc41-45bb-8108-8ddf1a2d5229\") " pod="kserve/s3-init-t972h" Apr 22 18:48:26.419653 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:26.419620 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mbjmv\" (UniqueName: \"kubernetes.io/projected/f57a8eb1-cc41-45bb-8108-8ddf1a2d5229-kube-api-access-mbjmv\") pod \"s3-init-t972h\" (UID: \"f57a8eb1-cc41-45bb-8108-8ddf1a2d5229\") " pod="kserve/s3-init-t972h" Apr 22 18:48:26.429066 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:26.429001 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbjmv\" (UniqueName: \"kubernetes.io/projected/f57a8eb1-cc41-45bb-8108-8ddf1a2d5229-kube-api-access-mbjmv\") pod \"s3-init-t972h\" (UID: \"f57a8eb1-cc41-45bb-8108-8ddf1a2d5229\") " pod="kserve/s3-init-t972h" Apr 22 18:48:26.487480 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:26.487420 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-t972h" Apr 22 18:48:26.617605 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:26.617575 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-t972h"] Apr 22 18:48:26.620083 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:48:26.620048 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf57a8eb1_cc41_45bb_8108_8ddf1a2d5229.slice/crio-fbf341b30fdbe09aea1a8e45ac36cb12f597e56cc6fe2bfbed8f6720b4d69efc WatchSource:0}: Error finding container fbf341b30fdbe09aea1a8e45ac36cb12f597e56cc6fe2bfbed8f6720b4d69efc: Status 404 returned error can't find the container with id fbf341b30fdbe09aea1a8e45ac36cb12f597e56cc6fe2bfbed8f6720b4d69efc Apr 22 18:48:27.540149 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:27.540107 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-t972h" event={"ID":"f57a8eb1-cc41-45bb-8108-8ddf1a2d5229","Type":"ContainerStarted","Data":"fbf341b30fdbe09aea1a8e45ac36cb12f597e56cc6fe2bfbed8f6720b4d69efc"} Apr 22 18:48:31.555871 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:31.555837 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-t972h" event={"ID":"f57a8eb1-cc41-45bb-8108-8ddf1a2d5229","Type":"ContainerStarted","Data":"1a8f43263b890613b87c6fdb9b01e32a8e2b068dde39965325994aa239c6aafe"} Apr 22 18:48:31.572009 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:31.571898 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-t972h" podStartSLOduration=1.116250698 podStartE2EDuration="5.571879506s" podCreationTimestamp="2026-04-22 18:48:26 +0000 UTC" firstStartedPulling="2026-04-22 18:48:26.621942725 +0000 UTC m=+635.612209333" lastFinishedPulling="2026-04-22 18:48:31.077571533 +0000 UTC m=+640.067838141" observedRunningTime="2026-04-22 18:48:31.571243274 +0000 UTC m=+640.561509902" watchObservedRunningTime="2026-04-22 18:48:31.571879506 +0000 UTC m=+640.562146136" Apr 22 18:48:34.566146 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:34.566062 2578 generic.go:358] "Generic (PLEG): container finished" podID="f57a8eb1-cc41-45bb-8108-8ddf1a2d5229" containerID="1a8f43263b890613b87c6fdb9b01e32a8e2b068dde39965325994aa239c6aafe" exitCode=0 Apr 22 18:48:34.566146 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:34.566102 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-t972h" event={"ID":"f57a8eb1-cc41-45bb-8108-8ddf1a2d5229","Type":"ContainerDied","Data":"1a8f43263b890613b87c6fdb9b01e32a8e2b068dde39965325994aa239c6aafe"} Apr 22 18:48:35.693414 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:35.693391 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-t972h" Apr 22 18:48:35.791415 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:35.791376 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbjmv\" (UniqueName: \"kubernetes.io/projected/f57a8eb1-cc41-45bb-8108-8ddf1a2d5229-kube-api-access-mbjmv\") pod \"f57a8eb1-cc41-45bb-8108-8ddf1a2d5229\" (UID: \"f57a8eb1-cc41-45bb-8108-8ddf1a2d5229\") " Apr 22 18:48:35.793681 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:35.793654 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f57a8eb1-cc41-45bb-8108-8ddf1a2d5229-kube-api-access-mbjmv" (OuterVolumeSpecName: "kube-api-access-mbjmv") pod "f57a8eb1-cc41-45bb-8108-8ddf1a2d5229" (UID: "f57a8eb1-cc41-45bb-8108-8ddf1a2d5229"). InnerVolumeSpecName "kube-api-access-mbjmv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:48:35.892682 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:35.892654 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mbjmv\" (UniqueName: \"kubernetes.io/projected/f57a8eb1-cc41-45bb-8108-8ddf1a2d5229-kube-api-access-mbjmv\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:48:36.573784 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:36.573754 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-t972h" Apr 22 18:48:36.573999 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:36.573754 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-t972h" event={"ID":"f57a8eb1-cc41-45bb-8108-8ddf1a2d5229","Type":"ContainerDied","Data":"fbf341b30fdbe09aea1a8e45ac36cb12f597e56cc6fe2bfbed8f6720b4d69efc"} Apr 22 18:48:36.573999 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:36.573865 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbf341b30fdbe09aea1a8e45ac36cb12f597e56cc6fe2bfbed8f6720b4d69efc" Apr 22 18:48:46.938633 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:46.938598 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr"] Apr 22 18:48:46.939111 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:46.939095 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f57a8eb1-cc41-45bb-8108-8ddf1a2d5229" containerName="s3-init" Apr 22 18:48:46.939166 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:46.939113 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f57a8eb1-cc41-45bb-8108-8ddf1a2d5229" containerName="s3-init" Apr 22 18:48:46.939223 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:46.939180 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f57a8eb1-cc41-45bb-8108-8ddf1a2d5229" containerName="s3-init" Apr 22 18:48:46.942165 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:46.942142 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr" Apr 22 18:48:46.944479 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:46.944438 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 18:48:46.944606 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:46.944565 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-1-openshift-default-dockercfg-78z46\"" Apr 22 18:48:46.944665 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:46.944652 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 18:48:46.945431 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:46.945409 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\"" Apr 22 18:48:46.950948 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:46.950839 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr"] Apr 22 18:48:46.979105 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:46.979068 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a51f3ffd-8fcd-4925-8919-0586473712a1-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-fxmjr\" (UID: \"a51f3ffd-8fcd-4925-8919-0586473712a1\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr" Apr 22 18:48:46.979275 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:46.979124 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/a51f3ffd-8fcd-4925-8919-0586473712a1-workload-certs\") pod 
\"router-gateway-1-openshift-default-6c59fbf55c-fxmjr\" (UID: \"a51f3ffd-8fcd-4925-8919-0586473712a1\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr" Apr 22 18:48:46.979275 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:46.979163 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/a51f3ffd-8fcd-4925-8919-0586473712a1-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-fxmjr\" (UID: \"a51f3ffd-8fcd-4925-8919-0586473712a1\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr" Apr 22 18:48:46.979275 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:46.979196 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/a51f3ffd-8fcd-4925-8919-0586473712a1-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-fxmjr\" (UID: \"a51f3ffd-8fcd-4925-8919-0586473712a1\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr" Apr 22 18:48:46.979275 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:46.979261 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcpg6\" (UniqueName: \"kubernetes.io/projected/a51f3ffd-8fcd-4925-8919-0586473712a1-kube-api-access-gcpg6\") pod \"router-gateway-1-openshift-default-6c59fbf55c-fxmjr\" (UID: \"a51f3ffd-8fcd-4925-8919-0586473712a1\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr" Apr 22 18:48:46.979527 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:46.979308 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/a51f3ffd-8fcd-4925-8919-0586473712a1-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-fxmjr\" (UID: 
\"a51f3ffd-8fcd-4925-8919-0586473712a1\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr" Apr 22 18:48:46.979527 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:46.979351 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/a51f3ffd-8fcd-4925-8919-0586473712a1-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-fxmjr\" (UID: \"a51f3ffd-8fcd-4925-8919-0586473712a1\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr" Apr 22 18:48:46.979527 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:46.979383 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/a51f3ffd-8fcd-4925-8919-0586473712a1-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-fxmjr\" (UID: \"a51f3ffd-8fcd-4925-8919-0586473712a1\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr" Apr 22 18:48:46.979527 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:46.979424 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/a51f3ffd-8fcd-4925-8919-0586473712a1-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-fxmjr\" (UID: \"a51f3ffd-8fcd-4925-8919-0586473712a1\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr" Apr 22 18:48:47.080143 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:47.080106 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/a51f3ffd-8fcd-4925-8919-0586473712a1-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-fxmjr\" (UID: \"a51f3ffd-8fcd-4925-8919-0586473712a1\") " 
pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr" Apr 22 18:48:47.080143 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:47.080141 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/a51f3ffd-8fcd-4925-8919-0586473712a1-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-fxmjr\" (UID: \"a51f3ffd-8fcd-4925-8919-0586473712a1\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr" Apr 22 18:48:47.080409 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:47.080166 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/a51f3ffd-8fcd-4925-8919-0586473712a1-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-fxmjr\" (UID: \"a51f3ffd-8fcd-4925-8919-0586473712a1\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr" Apr 22 18:48:47.080409 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:47.080198 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gcpg6\" (UniqueName: \"kubernetes.io/projected/a51f3ffd-8fcd-4925-8919-0586473712a1-kube-api-access-gcpg6\") pod \"router-gateway-1-openshift-default-6c59fbf55c-fxmjr\" (UID: \"a51f3ffd-8fcd-4925-8919-0586473712a1\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr" Apr 22 18:48:47.080409 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:47.080242 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/a51f3ffd-8fcd-4925-8919-0586473712a1-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-fxmjr\" (UID: \"a51f3ffd-8fcd-4925-8919-0586473712a1\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr" Apr 22 18:48:47.080409 ip-10-0-132-151 kubenswrapper[2578]: 
I0422 18:48:47.080267 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/a51f3ffd-8fcd-4925-8919-0586473712a1-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-fxmjr\" (UID: \"a51f3ffd-8fcd-4925-8919-0586473712a1\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr" Apr 22 18:48:47.080409 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:47.080295 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/a51f3ffd-8fcd-4925-8919-0586473712a1-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-fxmjr\" (UID: \"a51f3ffd-8fcd-4925-8919-0586473712a1\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr" Apr 22 18:48:47.080409 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:47.080326 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/a51f3ffd-8fcd-4925-8919-0586473712a1-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-fxmjr\" (UID: \"a51f3ffd-8fcd-4925-8919-0586473712a1\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr" Apr 22 18:48:47.080409 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:47.080357 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a51f3ffd-8fcd-4925-8919-0586473712a1-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-fxmjr\" (UID: \"a51f3ffd-8fcd-4925-8919-0586473712a1\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr" Apr 22 18:48:47.080779 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:47.080650 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: 
\"kubernetes.io/empty-dir/a51f3ffd-8fcd-4925-8919-0586473712a1-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-fxmjr\" (UID: \"a51f3ffd-8fcd-4925-8919-0586473712a1\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr" Apr 22 18:48:47.080779 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:47.080738 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/a51f3ffd-8fcd-4925-8919-0586473712a1-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-fxmjr\" (UID: \"a51f3ffd-8fcd-4925-8919-0586473712a1\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr" Apr 22 18:48:47.080891 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:47.080792 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/a51f3ffd-8fcd-4925-8919-0586473712a1-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-fxmjr\" (UID: \"a51f3ffd-8fcd-4925-8919-0586473712a1\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr" Apr 22 18:48:47.080891 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:47.080813 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/a51f3ffd-8fcd-4925-8919-0586473712a1-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-fxmjr\" (UID: \"a51f3ffd-8fcd-4925-8919-0586473712a1\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr" Apr 22 18:48:47.081081 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:47.081062 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/a51f3ffd-8fcd-4925-8919-0586473712a1-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-fxmjr\" (UID: 
\"a51f3ffd-8fcd-4925-8919-0586473712a1\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr" Apr 22 18:48:47.082652 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:47.082632 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/a51f3ffd-8fcd-4925-8919-0586473712a1-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-fxmjr\" (UID: \"a51f3ffd-8fcd-4925-8919-0586473712a1\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr" Apr 22 18:48:47.082844 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:47.082826 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/a51f3ffd-8fcd-4925-8919-0586473712a1-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-fxmjr\" (UID: \"a51f3ffd-8fcd-4925-8919-0586473712a1\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr" Apr 22 18:48:47.095118 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:47.095090 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a51f3ffd-8fcd-4925-8919-0586473712a1-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-fxmjr\" (UID: \"a51f3ffd-8fcd-4925-8919-0586473712a1\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr" Apr 22 18:48:47.095236 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:47.095215 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcpg6\" (UniqueName: \"kubernetes.io/projected/a51f3ffd-8fcd-4925-8919-0586473712a1-kube-api-access-gcpg6\") pod \"router-gateway-1-openshift-default-6c59fbf55c-fxmjr\" (UID: \"a51f3ffd-8fcd-4925-8919-0586473712a1\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr" Apr 22 18:48:47.254434 ip-10-0-132-151 
kubenswrapper[2578]: I0422 18:48:47.254347 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr" Apr 22 18:48:47.381920 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:47.381892 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr"] Apr 22 18:48:47.383880 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:48:47.383854 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda51f3ffd_8fcd_4925_8919_0586473712a1.slice/crio-62df44b45b06ee8d53fd5b3915f2dbe8b826f4e61413467a304e1077b4ec3094 WatchSource:0}: Error finding container 62df44b45b06ee8d53fd5b3915f2dbe8b826f4e61413467a304e1077b4ec3094: Status 404 returned error can't find the container with id 62df44b45b06ee8d53fd5b3915f2dbe8b826f4e61413467a304e1077b4ec3094 Apr 22 18:48:47.385522 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:47.385507 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:48:47.608664 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:47.608576 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr" event={"ID":"a51f3ffd-8fcd-4925-8919-0586473712a1","Type":"ContainerStarted","Data":"62df44b45b06ee8d53fd5b3915f2dbe8b826f4e61413467a304e1077b4ec3094"} Apr 22 18:48:49.966930 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:49.966885 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 22 18:48:49.967219 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:49.967006 2578 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 22 18:48:49.967219 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:49.967041 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 22 18:48:50.620079 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:50.620034 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr" event={"ID":"a51f3ffd-8fcd-4925-8919-0586473712a1","Type":"ContainerStarted","Data":"e70e889b65092ac0a08e39d8a705777fb699dbdfe24099614a5798eeff6e7bff"} Apr 22 18:48:50.641261 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:50.641195 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr" podStartSLOduration=2.0601536 podStartE2EDuration="4.641178751s" podCreationTimestamp="2026-04-22 18:48:46 +0000 UTC" firstStartedPulling="2026-04-22 18:48:47.385631863 +0000 UTC m=+656.375898470" lastFinishedPulling="2026-04-22 18:48:49.966657002 +0000 UTC m=+658.956923621" observedRunningTime="2026-04-22 18:48:50.639651495 +0000 UTC m=+659.629918114" watchObservedRunningTime="2026-04-22 18:48:50.641178751 +0000 UTC m=+659.631445380" Apr 22 18:48:51.254874 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:51.254836 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr" Apr 22 18:48:51.259682 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:51.259655 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr" Apr 22 18:48:51.623561 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:51.623484 2578 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr" Apr 22 18:48:51.624578 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:48:51.624560 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-fxmjr" Apr 22 18:49:09.298556 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:49:09.298516 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl"] Apr 22 18:49:09.336034 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:49:09.335991 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl"] Apr 22 18:49:09.336207 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:49:09.336131 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl" Apr 22 18:49:09.338956 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:49:09.338919 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs\"" Apr 22 18:49:09.339917 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:49:09.339885 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-hsgld\"" Apr 22 18:49:09.477560 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:49:09.477514 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/48c0a407-5ec4-4561-bff4-2d1b9d955b1f-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl\" (UID: \"48c0a407-5ec4-4561-bff4-2d1b9d955b1f\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl" Apr 22 18:49:09.477756 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:49:09.477577 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s4c7\" (UniqueName: \"kubernetes.io/projected/48c0a407-5ec4-4561-bff4-2d1b9d955b1f-kube-api-access-9s4c7\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl\" (UID: \"48c0a407-5ec4-4561-bff4-2d1b9d955b1f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl" Apr 22 18:49:09.477756 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:49:09.477651 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/48c0a407-5ec4-4561-bff4-2d1b9d955b1f-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl\" (UID: \"48c0a407-5ec4-4561-bff4-2d1b9d955b1f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl" Apr 22 18:49:09.477756 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:49:09.477718 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/48c0a407-5ec4-4561-bff4-2d1b9d955b1f-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl\" (UID: \"48c0a407-5ec4-4561-bff4-2d1b9d955b1f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl" Apr 22 18:49:09.477756 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:49:09.477752 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/48c0a407-5ec4-4561-bff4-2d1b9d955b1f-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl\" (UID: 
\"48c0a407-5ec4-4561-bff4-2d1b9d955b1f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl" Apr 22 18:49:09.477917 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:49:09.477796 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/48c0a407-5ec4-4561-bff4-2d1b9d955b1f-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl\" (UID: \"48c0a407-5ec4-4561-bff4-2d1b9d955b1f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl" Apr 22 18:49:09.578281 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:49:09.578176 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9s4c7\" (UniqueName: \"kubernetes.io/projected/48c0a407-5ec4-4561-bff4-2d1b9d955b1f-kube-api-access-9s4c7\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl\" (UID: \"48c0a407-5ec4-4561-bff4-2d1b9d955b1f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl" Apr 22 18:49:09.578281 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:49:09.578227 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/48c0a407-5ec4-4561-bff4-2d1b9d955b1f-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl\" (UID: \"48c0a407-5ec4-4561-bff4-2d1b9d955b1f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl" Apr 22 18:49:09.578281 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:49:09.578252 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/48c0a407-5ec4-4561-bff4-2d1b9d955b1f-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl\" (UID: \"48c0a407-5ec4-4561-bff4-2d1b9d955b1f\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl" Apr 22 18:49:09.578281 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:49:09.578271 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/48c0a407-5ec4-4561-bff4-2d1b9d955b1f-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl\" (UID: \"48c0a407-5ec4-4561-bff4-2d1b9d955b1f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl" Apr 22 18:49:09.578672 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:49:09.578304 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/48c0a407-5ec4-4561-bff4-2d1b9d955b1f-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl\" (UID: \"48c0a407-5ec4-4561-bff4-2d1b9d955b1f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl" Apr 22 18:49:09.578672 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:49:09.578326 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/48c0a407-5ec4-4561-bff4-2d1b9d955b1f-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl\" (UID: \"48c0a407-5ec4-4561-bff4-2d1b9d955b1f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl" Apr 22 18:49:09.578827 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:49:09.578804 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/48c0a407-5ec4-4561-bff4-2d1b9d955b1f-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl\" (UID: \"48c0a407-5ec4-4561-bff4-2d1b9d955b1f\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl" Apr 22 18:49:09.578884 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:49:09.578822 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/48c0a407-5ec4-4561-bff4-2d1b9d955b1f-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl\" (UID: \"48c0a407-5ec4-4561-bff4-2d1b9d955b1f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl" Apr 22 18:49:09.578884 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:49:09.578859 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/48c0a407-5ec4-4561-bff4-2d1b9d955b1f-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl\" (UID: \"48c0a407-5ec4-4561-bff4-2d1b9d955b1f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl" Apr 22 18:49:09.580790 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:49:09.580766 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/48c0a407-5ec4-4561-bff4-2d1b9d955b1f-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl\" (UID: \"48c0a407-5ec4-4561-bff4-2d1b9d955b1f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl" Apr 22 18:49:09.580902 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:49:09.580863 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/48c0a407-5ec4-4561-bff4-2d1b9d955b1f-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl\" (UID: \"48c0a407-5ec4-4561-bff4-2d1b9d955b1f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl" Apr 22 18:49:09.586715 ip-10-0-132-151 
kubenswrapper[2578]: I0422 18:49:09.586691 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s4c7\" (UniqueName: \"kubernetes.io/projected/48c0a407-5ec4-4561-bff4-2d1b9d955b1f-kube-api-access-9s4c7\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl\" (UID: \"48c0a407-5ec4-4561-bff4-2d1b9d955b1f\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl" Apr 22 18:49:09.647255 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:49:09.647215 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl" Apr 22 18:49:09.782299 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:49:09.782273 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl"] Apr 22 18:49:09.785162 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:49:09.785130 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48c0a407_5ec4_4561_bff4_2d1b9d955b1f.slice/crio-a786fe9945fb64b96710f213218f8413f9ce868516dda6ca3bf1480594890704 WatchSource:0}: Error finding container a786fe9945fb64b96710f213218f8413f9ce868516dda6ca3bf1480594890704: Status 404 returned error can't find the container with id a786fe9945fb64b96710f213218f8413f9ce868516dda6ca3bf1480594890704 Apr 22 18:49:10.685396 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:49:10.685350 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl" event={"ID":"48c0a407-5ec4-4561-bff4-2d1b9d955b1f","Type":"ContainerStarted","Data":"a786fe9945fb64b96710f213218f8413f9ce868516dda6ca3bf1480594890704"} Apr 22 18:49:13.696869 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:49:13.696829 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl" event={"ID":"48c0a407-5ec4-4561-bff4-2d1b9d955b1f","Type":"ContainerStarted","Data":"281d0549bdcd01fd82522502ced965049f3019b4823fd838bf5dbb3663ead7f6"} Apr 22 18:49:17.713446 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:49:17.713409 2578 generic.go:358] "Generic (PLEG): container finished" podID="48c0a407-5ec4-4561-bff4-2d1b9d955b1f" containerID="281d0549bdcd01fd82522502ced965049f3019b4823fd838bf5dbb3663ead7f6" exitCode=0 Apr 22 18:49:17.713858 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:49:17.713502 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl" event={"ID":"48c0a407-5ec4-4561-bff4-2d1b9d955b1f","Type":"ContainerDied","Data":"281d0549bdcd01fd82522502ced965049f3019b4823fd838bf5dbb3663ead7f6"} Apr 22 18:49:19.722869 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:49:19.722834 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl" event={"ID":"48c0a407-5ec4-4561-bff4-2d1b9d955b1f","Type":"ContainerStarted","Data":"a2a2f5de1f0ed97d46d7865dd71ed053a5d08cf5ecfb83d73f6ff670c67c3442"} Apr 22 18:49:19.743720 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:49:19.743658 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl" podStartSLOduration=1.715739084 podStartE2EDuration="10.743643751s" podCreationTimestamp="2026-04-22 18:49:09 +0000 UTC" firstStartedPulling="2026-04-22 18:49:09.787206397 +0000 UTC m=+678.777473005" lastFinishedPulling="2026-04-22 18:49:18.815111051 +0000 UTC m=+687.805377672" observedRunningTime="2026-04-22 18:49:19.741826272 +0000 UTC m=+688.732092902" watchObservedRunningTime="2026-04-22 18:49:19.743643751 +0000 UTC m=+688.733910420" Apr 22 18:49:29.647843 ip-10-0-132-151 
kubenswrapper[2578]: I0422 18:49:29.647800 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl" Apr 22 18:49:29.647843 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:49:29.647852 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl" Apr 22 18:49:29.660277 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:49:29.660243 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl" Apr 22 18:49:29.768530 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:49:29.768500 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl" Apr 22 18:50:49.964905 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:50:49.964867 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl"] Apr 22 18:50:49.965412 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:50:49.965155 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl" podUID="48c0a407-5ec4-4561-bff4-2d1b9d955b1f" containerName="main" containerID="cri-o://a2a2f5de1f0ed97d46d7865dd71ed053a5d08cf5ecfb83d73f6ff670c67c3442" gracePeriod=30 Apr 22 18:50:50.215092 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:50:50.215013 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl" Apr 22 18:50:50.231287 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:50:50.231254 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/48c0a407-5ec4-4561-bff4-2d1b9d955b1f-dshm\") pod \"48c0a407-5ec4-4561-bff4-2d1b9d955b1f\" (UID: \"48c0a407-5ec4-4561-bff4-2d1b9d955b1f\") " Apr 22 18:50:50.231500 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:50:50.231317 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/48c0a407-5ec4-4561-bff4-2d1b9d955b1f-model-cache\") pod \"48c0a407-5ec4-4561-bff4-2d1b9d955b1f\" (UID: \"48c0a407-5ec4-4561-bff4-2d1b9d955b1f\") " Apr 22 18:50:50.231500 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:50:50.231353 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/48c0a407-5ec4-4561-bff4-2d1b9d955b1f-home\") pod \"48c0a407-5ec4-4561-bff4-2d1b9d955b1f\" (UID: \"48c0a407-5ec4-4561-bff4-2d1b9d955b1f\") " Apr 22 18:50:50.231500 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:50:50.231397 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/48c0a407-5ec4-4561-bff4-2d1b9d955b1f-kserve-provision-location\") pod \"48c0a407-5ec4-4561-bff4-2d1b9d955b1f\" (UID: \"48c0a407-5ec4-4561-bff4-2d1b9d955b1f\") " Apr 22 18:50:50.231500 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:50:50.231433 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/48c0a407-5ec4-4561-bff4-2d1b9d955b1f-tls-certs\") pod \"48c0a407-5ec4-4561-bff4-2d1b9d955b1f\" (UID: \"48c0a407-5ec4-4561-bff4-2d1b9d955b1f\") " Apr 22 18:50:50.231764 ip-10-0-132-151 kubenswrapper[2578]: 
I0422 18:50:50.231509 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9s4c7\" (UniqueName: \"kubernetes.io/projected/48c0a407-5ec4-4561-bff4-2d1b9d955b1f-kube-api-access-9s4c7\") pod \"48c0a407-5ec4-4561-bff4-2d1b9d955b1f\" (UID: \"48c0a407-5ec4-4561-bff4-2d1b9d955b1f\") " Apr 22 18:50:50.231764 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:50:50.231653 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48c0a407-5ec4-4561-bff4-2d1b9d955b1f-home" (OuterVolumeSpecName: "home") pod "48c0a407-5ec4-4561-bff4-2d1b9d955b1f" (UID: "48c0a407-5ec4-4561-bff4-2d1b9d955b1f"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:50:50.231764 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:50:50.231674 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48c0a407-5ec4-4561-bff4-2d1b9d955b1f-model-cache" (OuterVolumeSpecName: "model-cache") pod "48c0a407-5ec4-4561-bff4-2d1b9d955b1f" (UID: "48c0a407-5ec4-4561-bff4-2d1b9d955b1f"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:50:50.231923 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:50:50.231803 2578 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/48c0a407-5ec4-4561-bff4-2d1b9d955b1f-model-cache\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:50:50.231923 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:50:50.231822 2578 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/48c0a407-5ec4-4561-bff4-2d1b9d955b1f-home\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:50:50.233792 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:50:50.233747 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48c0a407-5ec4-4561-bff4-2d1b9d955b1f-dshm" (OuterVolumeSpecName: "dshm") pod "48c0a407-5ec4-4561-bff4-2d1b9d955b1f" (UID: "48c0a407-5ec4-4561-bff4-2d1b9d955b1f"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:50:50.233913 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:50:50.233849 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48c0a407-5ec4-4561-bff4-2d1b9d955b1f-kube-api-access-9s4c7" (OuterVolumeSpecName: "kube-api-access-9s4c7") pod "48c0a407-5ec4-4561-bff4-2d1b9d955b1f" (UID: "48c0a407-5ec4-4561-bff4-2d1b9d955b1f"). InnerVolumeSpecName "kube-api-access-9s4c7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:50:50.233913 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:50:50.233860 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48c0a407-5ec4-4561-bff4-2d1b9d955b1f-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "48c0a407-5ec4-4561-bff4-2d1b9d955b1f" (UID: "48c0a407-5ec4-4561-bff4-2d1b9d955b1f"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:50:50.298334 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:50:50.298273 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48c0a407-5ec4-4561-bff4-2d1b9d955b1f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "48c0a407-5ec4-4561-bff4-2d1b9d955b1f" (UID: "48c0a407-5ec4-4561-bff4-2d1b9d955b1f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:50:50.332660 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:50:50.332626 2578 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/48c0a407-5ec4-4561-bff4-2d1b9d955b1f-dshm\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:50:50.332660 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:50:50.332654 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/48c0a407-5ec4-4561-bff4-2d1b9d955b1f-kserve-provision-location\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:50:50.332660 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:50:50.332665 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/48c0a407-5ec4-4561-bff4-2d1b9d955b1f-tls-certs\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:50:50.332908 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:50:50.332677 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9s4c7\" (UniqueName: \"kubernetes.io/projected/48c0a407-5ec4-4561-bff4-2d1b9d955b1f-kube-api-access-9s4c7\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:50:51.025833 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:50:51.025793 2578 generic.go:358] "Generic (PLEG): container finished" podID="48c0a407-5ec4-4561-bff4-2d1b9d955b1f" 
containerID="a2a2f5de1f0ed97d46d7865dd71ed053a5d08cf5ecfb83d73f6ff670c67c3442" exitCode=0 Apr 22 18:50:51.026330 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:50:51.025867 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl" Apr 22 18:50:51.026330 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:50:51.025885 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl" event={"ID":"48c0a407-5ec4-4561-bff4-2d1b9d955b1f","Type":"ContainerDied","Data":"a2a2f5de1f0ed97d46d7865dd71ed053a5d08cf5ecfb83d73f6ff670c67c3442"} Apr 22 18:50:51.026330 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:50:51.025936 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl" event={"ID":"48c0a407-5ec4-4561-bff4-2d1b9d955b1f","Type":"ContainerDied","Data":"a786fe9945fb64b96710f213218f8413f9ce868516dda6ca3bf1480594890704"} Apr 22 18:50:51.026330 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:50:51.025961 2578 scope.go:117] "RemoveContainer" containerID="a2a2f5de1f0ed97d46d7865dd71ed053a5d08cf5ecfb83d73f6ff670c67c3442" Apr 22 18:50:51.036000 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:50:51.035970 2578 scope.go:117] "RemoveContainer" containerID="281d0549bdcd01fd82522502ced965049f3019b4823fd838bf5dbb3663ead7f6" Apr 22 18:50:51.050840 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:50:51.050813 2578 scope.go:117] "RemoveContainer" containerID="a2a2f5de1f0ed97d46d7865dd71ed053a5d08cf5ecfb83d73f6ff670c67c3442" Apr 22 18:50:51.050930 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:50:51.050908 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl"] Apr 22 18:50:51.051171 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:50:51.051152 2578 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2a2f5de1f0ed97d46d7865dd71ed053a5d08cf5ecfb83d73f6ff670c67c3442\": container with ID starting with a2a2f5de1f0ed97d46d7865dd71ed053a5d08cf5ecfb83d73f6ff670c67c3442 not found: ID does not exist" containerID="a2a2f5de1f0ed97d46d7865dd71ed053a5d08cf5ecfb83d73f6ff670c67c3442" Apr 22 18:50:51.051252 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:50:51.051179 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2a2f5de1f0ed97d46d7865dd71ed053a5d08cf5ecfb83d73f6ff670c67c3442"} err="failed to get container status \"a2a2f5de1f0ed97d46d7865dd71ed053a5d08cf5ecfb83d73f6ff670c67c3442\": rpc error: code = NotFound desc = could not find container \"a2a2f5de1f0ed97d46d7865dd71ed053a5d08cf5ecfb83d73f6ff670c67c3442\": container with ID starting with a2a2f5de1f0ed97d46d7865dd71ed053a5d08cf5ecfb83d73f6ff670c67c3442 not found: ID does not exist" Apr 22 18:50:51.051252 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:50:51.051199 2578 scope.go:117] "RemoveContainer" containerID="281d0549bdcd01fd82522502ced965049f3019b4823fd838bf5dbb3663ead7f6" Apr 22 18:50:51.051439 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:50:51.051422 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"281d0549bdcd01fd82522502ced965049f3019b4823fd838bf5dbb3663ead7f6\": container with ID starting with 281d0549bdcd01fd82522502ced965049f3019b4823fd838bf5dbb3663ead7f6 not found: ID does not exist" containerID="281d0549bdcd01fd82522502ced965049f3019b4823fd838bf5dbb3663ead7f6" Apr 22 18:50:51.051500 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:50:51.051446 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"281d0549bdcd01fd82522502ced965049f3019b4823fd838bf5dbb3663ead7f6"} err="failed to get container status 
\"281d0549bdcd01fd82522502ced965049f3019b4823fd838bf5dbb3663ead7f6\": rpc error: code = NotFound desc = could not find container \"281d0549bdcd01fd82522502ced965049f3019b4823fd838bf5dbb3663ead7f6\": container with ID starting with 281d0549bdcd01fd82522502ced965049f3019b4823fd838bf5dbb3663ead7f6 not found: ID does not exist" Apr 22 18:50:51.057811 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:50:51.057776 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-98c767f-xpsxl"] Apr 22 18:50:51.540322 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:50:51.540287 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48c0a407-5ec4-4561-bff4-2d1b9d955b1f" path="/var/lib/kubelet/pods/48c0a407-5ec4-4561-bff4-2d1b9d955b1f/volumes" Apr 22 18:51:06.999538 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:51:06.999501 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht"] Apr 22 18:51:07.000121 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:51:06.999882 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="48c0a407-5ec4-4561-bff4-2d1b9d955b1f" containerName="main" Apr 22 18:51:07.000121 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:51:06.999894 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c0a407-5ec4-4561-bff4-2d1b9d955b1f" containerName="main" Apr 22 18:51:07.000121 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:51:06.999904 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="48c0a407-5ec4-4561-bff4-2d1b9d955b1f" containerName="storage-initializer" Apr 22 18:51:07.000121 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:51:06.999912 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c0a407-5ec4-4561-bff4-2d1b9d955b1f" containerName="storage-initializer" Apr 22 18:51:07.000121 ip-10-0-132-151 kubenswrapper[2578]: I0422 
18:51:06.999992 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="48c0a407-5ec4-4561-bff4-2d1b9d955b1f" containerName="main" Apr 22 18:51:07.010236 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:51:07.010195 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" Apr 22 18:51:07.012520 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:51:07.012489 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht"] Apr 22 18:51:07.012816 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:51:07.012795 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-hsgld\"" Apr 22 18:51:07.012946 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:51:07.012835 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 22 18:51:07.076339 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:51:07.076303 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht\" (UID: \"509e53f5-3242-48ef-b2ae-a3fd2b1bc11a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" Apr 22 18:51:07.076589 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:51:07.076364 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht\" (UID: 
\"509e53f5-3242-48ef-b2ae-a3fd2b1bc11a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" Apr 22 18:51:07.076589 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:51:07.076519 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht\" (UID: \"509e53f5-3242-48ef-b2ae-a3fd2b1bc11a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" Apr 22 18:51:07.076589 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:51:07.076558 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht\" (UID: \"509e53f5-3242-48ef-b2ae-a3fd2b1bc11a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" Apr 22 18:51:07.076738 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:51:07.076588 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht\" (UID: \"509e53f5-3242-48ef-b2ae-a3fd2b1bc11a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" Apr 22 18:51:07.076738 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:51:07.076630 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q26k\" (UniqueName: \"kubernetes.io/projected/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a-kube-api-access-5q26k\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht\" (UID: 
\"509e53f5-3242-48ef-b2ae-a3fd2b1bc11a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" Apr 22 18:51:07.177419 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:51:07.177372 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht\" (UID: \"509e53f5-3242-48ef-b2ae-a3fd2b1bc11a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" Apr 22 18:51:07.177639 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:51:07.177425 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht\" (UID: \"509e53f5-3242-48ef-b2ae-a3fd2b1bc11a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" Apr 22 18:51:07.177639 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:51:07.177484 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5q26k\" (UniqueName: \"kubernetes.io/projected/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a-kube-api-access-5q26k\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht\" (UID: \"509e53f5-3242-48ef-b2ae-a3fd2b1bc11a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" Apr 22 18:51:07.177639 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:51:07.177514 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht\" (UID: \"509e53f5-3242-48ef-b2ae-a3fd2b1bc11a\") 
" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" Apr 22 18:51:07.177639 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:51:07.177551 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht\" (UID: \"509e53f5-3242-48ef-b2ae-a3fd2b1bc11a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" Apr 22 18:51:07.177855 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:51:07.177748 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht\" (UID: \"509e53f5-3242-48ef-b2ae-a3fd2b1bc11a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" Apr 22 18:51:07.177855 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:51:07.177833 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht\" (UID: \"509e53f5-3242-48ef-b2ae-a3fd2b1bc11a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" Apr 22 18:51:07.177953 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:51:07.177923 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht\" (UID: \"509e53f5-3242-48ef-b2ae-a3fd2b1bc11a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" Apr 22 18:51:07.177990 
ip-10-0-132-151 kubenswrapper[2578]: I0422 18:51:07.177967 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht\" (UID: \"509e53f5-3242-48ef-b2ae-a3fd2b1bc11a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" Apr 22 18:51:07.179813 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:51:07.179793 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht\" (UID: \"509e53f5-3242-48ef-b2ae-a3fd2b1bc11a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" Apr 22 18:51:07.180055 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:51:07.180036 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht\" (UID: \"509e53f5-3242-48ef-b2ae-a3fd2b1bc11a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" Apr 22 18:51:07.188244 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:51:07.188213 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q26k\" (UniqueName: \"kubernetes.io/projected/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a-kube-api-access-5q26k\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht\" (UID: \"509e53f5-3242-48ef-b2ae-a3fd2b1bc11a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" Apr 22 18:51:07.323099 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:51:07.323005 2578 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" Apr 22 18:51:07.458706 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:51:07.458675 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht"] Apr 22 18:51:07.464023 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:51:07.463978 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod509e53f5_3242_48ef_b2ae_a3fd2b1bc11a.slice/crio-d418ebe1d1fe2564845f0071428ff383bfc7bc2bb231fb703ab3ce826f201007 WatchSource:0}: Error finding container d418ebe1d1fe2564845f0071428ff383bfc7bc2bb231fb703ab3ce826f201007: Status 404 returned error can't find the container with id d418ebe1d1fe2564845f0071428ff383bfc7bc2bb231fb703ab3ce826f201007 Apr 22 18:51:08.083738 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:51:08.083700 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" event={"ID":"509e53f5-3242-48ef-b2ae-a3fd2b1bc11a","Type":"ContainerStarted","Data":"ad16f42b481c91f950cc63e14efec4340b127b61dcbc2f0827877282538c0846"} Apr 22 18:51:08.083738 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:51:08.083741 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" event={"ID":"509e53f5-3242-48ef-b2ae-a3fd2b1bc11a","Type":"ContainerStarted","Data":"d418ebe1d1fe2564845f0071428ff383bfc7bc2bb231fb703ab3ce826f201007"} Apr 22 18:51:12.099894 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:51:12.099853 2578 generic.go:358] "Generic (PLEG): container finished" podID="509e53f5-3242-48ef-b2ae-a3fd2b1bc11a" containerID="ad16f42b481c91f950cc63e14efec4340b127b61dcbc2f0827877282538c0846" exitCode=0 Apr 22 18:51:12.100278 ip-10-0-132-151 
kubenswrapper[2578]: I0422 18:51:12.099927 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" event={"ID":"509e53f5-3242-48ef-b2ae-a3fd2b1bc11a","Type":"ContainerDied","Data":"ad16f42b481c91f950cc63e14efec4340b127b61dcbc2f0827877282538c0846"} Apr 22 18:51:58.269115 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:51:58.269024 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" event={"ID":"509e53f5-3242-48ef-b2ae-a3fd2b1bc11a","Type":"ContainerStarted","Data":"b7565e8b2bfcbcd604e009bfe25cfa1fecdbd4c5ce300534e230ec8b2069f3bb"} Apr 22 18:51:58.291847 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:51:58.291789 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" podStartSLOduration=6.39018999 podStartE2EDuration="52.291774784s" podCreationTimestamp="2026-04-22 18:51:06 +0000 UTC" firstStartedPulling="2026-04-22 18:51:12.101062834 +0000 UTC m=+801.091329441" lastFinishedPulling="2026-04-22 18:51:58.002647628 +0000 UTC m=+846.992914235" observedRunningTime="2026-04-22 18:51:58.288637849 +0000 UTC m=+847.278904479" watchObservedRunningTime="2026-04-22 18:51:58.291774784 +0000 UTC m=+847.282041413" Apr 22 18:52:07.323438 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:07.323395 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" Apr 22 18:52:07.323438 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:07.323442 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" Apr 22 18:52:07.324989 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:07.324928 2578 prober.go:120] 
"Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" podUID="509e53f5-3242-48ef-b2ae-a3fd2b1bc11a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused" Apr 22 18:52:17.324137 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:17.324089 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" podUID="509e53f5-3242-48ef-b2ae-a3fd2b1bc11a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused" Apr 22 18:52:27.324130 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:27.324078 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" podUID="509e53f5-3242-48ef-b2ae-a3fd2b1bc11a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused" Apr 22 18:52:37.323731 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:37.323678 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" podUID="509e53f5-3242-48ef-b2ae-a3fd2b1bc11a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused" Apr 22 18:52:47.323822 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:47.323722 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" podUID="509e53f5-3242-48ef-b2ae-a3fd2b1bc11a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused" 
Apr 22 18:52:49.901722 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:49.901686 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6"]
Apr 22 18:52:49.904235 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:49.904213 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6"
Apr 22 18:52:49.906716 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:49.906690 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\""
Apr 22 18:52:49.915306 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:49.915276 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6"]
Apr 22 18:52:49.986249 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:49.986211 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e902a3ad-a078-499b-8721-22ab824f5588-model-cache\") pod \"precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6\" (UID: \"e902a3ad-a078-499b-8721-22ab824f5588\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6"
Apr 22 18:52:49.986249 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:49.986247 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e902a3ad-a078-499b-8721-22ab824f5588-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6\" (UID: \"e902a3ad-a078-499b-8721-22ab824f5588\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6"
Apr 22 18:52:49.986454 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:49.986269 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e902a3ad-a078-499b-8721-22ab824f5588-dshm\") pod \"precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6\" (UID: \"e902a3ad-a078-499b-8721-22ab824f5588\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6"
Apr 22 18:52:49.986454 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:49.986368 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzpqv\" (UniqueName: \"kubernetes.io/projected/e902a3ad-a078-499b-8721-22ab824f5588-kube-api-access-zzpqv\") pod \"precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6\" (UID: \"e902a3ad-a078-499b-8721-22ab824f5588\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6"
Apr 22 18:52:49.986454 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:49.986413 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e902a3ad-a078-499b-8721-22ab824f5588-home\") pod \"precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6\" (UID: \"e902a3ad-a078-499b-8721-22ab824f5588\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6"
Apr 22 18:52:49.986454 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:49.986443 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e902a3ad-a078-499b-8721-22ab824f5588-tls-certs\") pod \"precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6\" (UID: \"e902a3ad-a078-499b-8721-22ab824f5588\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6"
Apr 22 18:52:50.087642 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:50.087603 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e902a3ad-a078-499b-8721-22ab824f5588-model-cache\") pod \"precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6\" (UID: \"e902a3ad-a078-499b-8721-22ab824f5588\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6"
Apr 22 18:52:50.087642 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:50.087642 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e902a3ad-a078-499b-8721-22ab824f5588-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6\" (UID: \"e902a3ad-a078-499b-8721-22ab824f5588\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6"
Apr 22 18:52:50.087898 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:50.087667 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e902a3ad-a078-499b-8721-22ab824f5588-dshm\") pod \"precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6\" (UID: \"e902a3ad-a078-499b-8721-22ab824f5588\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6"
Apr 22 18:52:50.087898 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:50.087696 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zzpqv\" (UniqueName: \"kubernetes.io/projected/e902a3ad-a078-499b-8721-22ab824f5588-kube-api-access-zzpqv\") pod \"precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6\" (UID: \"e902a3ad-a078-499b-8721-22ab824f5588\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6"
Apr 22 18:52:50.087898 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:50.087718 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e902a3ad-a078-499b-8721-22ab824f5588-home\") pod \"precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6\" (UID: \"e902a3ad-a078-499b-8721-22ab824f5588\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6"
Apr 22 18:52:50.087898 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:50.087733 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e902a3ad-a078-499b-8721-22ab824f5588-tls-certs\") pod \"precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6\" (UID: \"e902a3ad-a078-499b-8721-22ab824f5588\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6"
Apr 22 18:52:50.088141 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:50.088108 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e902a3ad-a078-499b-8721-22ab824f5588-model-cache\") pod \"precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6\" (UID: \"e902a3ad-a078-499b-8721-22ab824f5588\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6"
Apr 22 18:52:50.088199 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:50.088172 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e902a3ad-a078-499b-8721-22ab824f5588-home\") pod \"precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6\" (UID: \"e902a3ad-a078-499b-8721-22ab824f5588\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6"
Apr 22 18:52:50.088371 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:50.088308 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e902a3ad-a078-499b-8721-22ab824f5588-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6\" (UID: \"e902a3ad-a078-499b-8721-22ab824f5588\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6"
Apr 22 18:52:50.090737 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:50.090710 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e902a3ad-a078-499b-8721-22ab824f5588-dshm\") pod \"precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6\" (UID: \"e902a3ad-a078-499b-8721-22ab824f5588\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6"
Apr 22 18:52:50.090941 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:50.090923 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e902a3ad-a078-499b-8721-22ab824f5588-tls-certs\") pod \"precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6\" (UID: \"e902a3ad-a078-499b-8721-22ab824f5588\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6"
Apr 22 18:52:50.096645 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:50.096616 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzpqv\" (UniqueName: \"kubernetes.io/projected/e902a3ad-a078-499b-8721-22ab824f5588-kube-api-access-zzpqv\") pod \"precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6\" (UID: \"e902a3ad-a078-499b-8721-22ab824f5588\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6"
Apr 22 18:52:50.216035 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:50.215953 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6"
Apr 22 18:52:50.345175 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:50.345139 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6"]
Apr 22 18:52:50.347128 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:52:50.347099 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode902a3ad_a078_499b_8721_22ab824f5588.slice/crio-d97d7cff3038065633159a272310f6c813a3c14efb943c7754f2511dff89331a WatchSource:0}: Error finding container d97d7cff3038065633159a272310f6c813a3c14efb943c7754f2511dff89331a: Status 404 returned error can't find the container with id d97d7cff3038065633159a272310f6c813a3c14efb943c7754f2511dff89331a
Apr 22 18:52:50.447597 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:50.447562 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6" event={"ID":"e902a3ad-a078-499b-8721-22ab824f5588","Type":"ContainerStarted","Data":"0ed7b8b72baaf75023fc3c065766d3893107afef5da28d4bd2e8950f06f77b68"}
Apr 22 18:52:50.447597 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:50.447603 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6" event={"ID":"e902a3ad-a078-499b-8721-22ab824f5588","Type":"ContainerStarted","Data":"d97d7cff3038065633159a272310f6c813a3c14efb943c7754f2511dff89331a"}
Apr 22 18:52:51.477752 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:51.477721 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2rx9_14a4227e-0dba-42ae-a250-c23eb012b836/ovn-acl-logging/0.log"
Apr 22 18:52:51.480798 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:51.480771 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2rx9_14a4227e-0dba-42ae-a250-c23eb012b836/ovn-acl-logging/0.log"
Apr 22 18:52:55.472041 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:55.472004 2578 generic.go:358] "Generic (PLEG): container finished" podID="e902a3ad-a078-499b-8721-22ab824f5588" containerID="0ed7b8b72baaf75023fc3c065766d3893107afef5da28d4bd2e8950f06f77b68" exitCode=0
Apr 22 18:52:55.472433 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:55.472079 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6" event={"ID":"e902a3ad-a078-499b-8721-22ab824f5588","Type":"ContainerDied","Data":"0ed7b8b72baaf75023fc3c065766d3893107afef5da28d4bd2e8950f06f77b68"}
Apr 22 18:52:56.476965 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:56.476926 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6" event={"ID":"e902a3ad-a078-499b-8721-22ab824f5588","Type":"ContainerStarted","Data":"62e1ea42bae9d3a15a848d0e8f9251c3e7b811307a896aad9a01699d320e70a5"}
Apr 22 18:52:56.497106 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:56.497038 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6" podStartSLOduration=7.497018592 podStartE2EDuration="7.497018592s" podCreationTimestamp="2026-04-22 18:52:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:52:56.49464928 +0000 UTC m=+905.484915934" watchObservedRunningTime="2026-04-22 18:52:56.497018592 +0000 UTC m=+905.487285223"
Apr 22 18:52:57.324210 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:52:57.324156 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" podUID="509e53f5-3242-48ef-b2ae-a3fd2b1bc11a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused"
Apr 22 18:53:00.216553 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:00.216510 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6"
Apr 22 18:53:00.216553 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:00.216556 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6"
Apr 22 18:53:00.228907 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:00.228874 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6"
Apr 22 18:53:00.502411 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:00.502329 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6"
Apr 22 18:53:07.323519 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:07.323433 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" podUID="509e53f5-3242-48ef-b2ae-a3fd2b1bc11a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused"
Apr 22 18:53:17.324265 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:17.324207 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" podUID="509e53f5-3242-48ef-b2ae-a3fd2b1bc11a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused"
Apr 22 18:53:27.324292 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:27.324230 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" podUID="509e53f5-3242-48ef-b2ae-a3fd2b1bc11a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused"
Apr 22 18:53:36.796670 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:36.796630 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6"]
Apr 22 18:53:36.797144 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:36.797001 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6" podUID="e902a3ad-a078-499b-8721-22ab824f5588" containerName="main" containerID="cri-o://62e1ea42bae9d3a15a848d0e8f9251c3e7b811307a896aad9a01699d320e70a5" gracePeriod=30
Apr 22 18:53:37.041018 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:37.040994 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6"
Apr 22 18:53:37.104154 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:37.104051 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e902a3ad-a078-499b-8721-22ab824f5588-home\") pod \"e902a3ad-a078-499b-8721-22ab824f5588\" (UID: \"e902a3ad-a078-499b-8721-22ab824f5588\") "
Apr 22 18:53:37.104154 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:37.104105 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e902a3ad-a078-499b-8721-22ab824f5588-tls-certs\") pod \"e902a3ad-a078-499b-8721-22ab824f5588\" (UID: \"e902a3ad-a078-499b-8721-22ab824f5588\") "
Apr 22 18:53:37.104154 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:37.104151 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e902a3ad-a078-499b-8721-22ab824f5588-dshm\") pod \"e902a3ad-a078-499b-8721-22ab824f5588\" (UID: \"e902a3ad-a078-499b-8721-22ab824f5588\") "
Apr 22 18:53:37.104441 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:37.104194 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e902a3ad-a078-499b-8721-22ab824f5588-kserve-provision-location\") pod \"e902a3ad-a078-499b-8721-22ab824f5588\" (UID: \"e902a3ad-a078-499b-8721-22ab824f5588\") "
Apr 22 18:53:37.104441 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:37.104217 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzpqv\" (UniqueName: \"kubernetes.io/projected/e902a3ad-a078-499b-8721-22ab824f5588-kube-api-access-zzpqv\") pod \"e902a3ad-a078-499b-8721-22ab824f5588\" (UID: \"e902a3ad-a078-499b-8721-22ab824f5588\") "
Apr 22 18:53:37.104441 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:37.104240 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e902a3ad-a078-499b-8721-22ab824f5588-model-cache\") pod \"e902a3ad-a078-499b-8721-22ab824f5588\" (UID: \"e902a3ad-a078-499b-8721-22ab824f5588\") "
Apr 22 18:53:37.104441 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:37.104415 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e902a3ad-a078-499b-8721-22ab824f5588-home" (OuterVolumeSpecName: "home") pod "e902a3ad-a078-499b-8721-22ab824f5588" (UID: "e902a3ad-a078-499b-8721-22ab824f5588"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:53:37.104658 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:37.104611 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e902a3ad-a078-499b-8721-22ab824f5588-model-cache" (OuterVolumeSpecName: "model-cache") pod "e902a3ad-a078-499b-8721-22ab824f5588" (UID: "e902a3ad-a078-499b-8721-22ab824f5588"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:53:37.106520 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:37.106489 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e902a3ad-a078-499b-8721-22ab824f5588-dshm" (OuterVolumeSpecName: "dshm") pod "e902a3ad-a078-499b-8721-22ab824f5588" (UID: "e902a3ad-a078-499b-8721-22ab824f5588"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:53:37.106520 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:37.106509 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e902a3ad-a078-499b-8721-22ab824f5588-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e902a3ad-a078-499b-8721-22ab824f5588" (UID: "e902a3ad-a078-499b-8721-22ab824f5588"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:53:37.106671 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:37.106496 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e902a3ad-a078-499b-8721-22ab824f5588-kube-api-access-zzpqv" (OuterVolumeSpecName: "kube-api-access-zzpqv") pod "e902a3ad-a078-499b-8721-22ab824f5588" (UID: "e902a3ad-a078-499b-8721-22ab824f5588"). InnerVolumeSpecName "kube-api-access-zzpqv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:53:37.159888 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:37.159821 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e902a3ad-a078-499b-8721-22ab824f5588-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e902a3ad-a078-499b-8721-22ab824f5588" (UID: "e902a3ad-a078-499b-8721-22ab824f5588"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:53:37.205750 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:37.205712 2578 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e902a3ad-a078-499b-8721-22ab824f5588-home\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:53:37.205750 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:37.205744 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e902a3ad-a078-499b-8721-22ab824f5588-tls-certs\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:53:37.205750 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:37.205754 2578 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e902a3ad-a078-499b-8721-22ab824f5588-dshm\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:53:37.205984 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:37.205764 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e902a3ad-a078-499b-8721-22ab824f5588-kserve-provision-location\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:53:37.205984 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:37.205774 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zzpqv\" (UniqueName: \"kubernetes.io/projected/e902a3ad-a078-499b-8721-22ab824f5588-kube-api-access-zzpqv\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:53:37.205984 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:37.205783 2578 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e902a3ad-a078-499b-8721-22ab824f5588-model-cache\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:53:37.334656 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:37.334621 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht"
Apr 22 18:53:37.342611 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:37.342581 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht"
Apr 22 18:53:37.626815 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:37.626774 2578 generic.go:358] "Generic (PLEG): container finished" podID="e902a3ad-a078-499b-8721-22ab824f5588" containerID="62e1ea42bae9d3a15a848d0e8f9251c3e7b811307a896aad9a01699d320e70a5" exitCode=0
Apr 22 18:53:37.627010 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:37.626840 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6"
Apr 22 18:53:37.627010 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:37.626858 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6" event={"ID":"e902a3ad-a078-499b-8721-22ab824f5588","Type":"ContainerDied","Data":"62e1ea42bae9d3a15a848d0e8f9251c3e7b811307a896aad9a01699d320e70a5"}
Apr 22 18:53:37.627010 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:37.626896 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6" event={"ID":"e902a3ad-a078-499b-8721-22ab824f5588","Type":"ContainerDied","Data":"d97d7cff3038065633159a272310f6c813a3c14efb943c7754f2511dff89331a"}
Apr 22 18:53:37.627010 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:37.626916 2578 scope.go:117] "RemoveContainer" containerID="62e1ea42bae9d3a15a848d0e8f9251c3e7b811307a896aad9a01699d320e70a5"
Apr 22 18:53:37.635550 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:37.635520 2578 scope.go:117] "RemoveContainer" containerID="0ed7b8b72baaf75023fc3c065766d3893107afef5da28d4bd2e8950f06f77b68"
Apr 22 18:53:37.651124 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:37.651095 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6"]
Apr 22 18:53:37.656662 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:37.656630 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-zj7f6"]
Apr 22 18:53:37.699044 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:37.699018 2578 scope.go:117] "RemoveContainer" containerID="62e1ea42bae9d3a15a848d0e8f9251c3e7b811307a896aad9a01699d320e70a5"
Apr 22 18:53:37.699401 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:53:37.699371 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62e1ea42bae9d3a15a848d0e8f9251c3e7b811307a896aad9a01699d320e70a5\": container with ID starting with 62e1ea42bae9d3a15a848d0e8f9251c3e7b811307a896aad9a01699d320e70a5 not found: ID does not exist" containerID="62e1ea42bae9d3a15a848d0e8f9251c3e7b811307a896aad9a01699d320e70a5"
Apr 22 18:53:37.699516 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:37.699412 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62e1ea42bae9d3a15a848d0e8f9251c3e7b811307a896aad9a01699d320e70a5"} err="failed to get container status \"62e1ea42bae9d3a15a848d0e8f9251c3e7b811307a896aad9a01699d320e70a5\": rpc error: code = NotFound desc = could not find container \"62e1ea42bae9d3a15a848d0e8f9251c3e7b811307a896aad9a01699d320e70a5\": container with ID starting with 62e1ea42bae9d3a15a848d0e8f9251c3e7b811307a896aad9a01699d320e70a5 not found: ID does not exist"
Apr 22 18:53:37.699516 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:37.699434 2578 scope.go:117] "RemoveContainer" containerID="0ed7b8b72baaf75023fc3c065766d3893107afef5da28d4bd2e8950f06f77b68"
Apr 22 18:53:37.699758 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:53:37.699738 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ed7b8b72baaf75023fc3c065766d3893107afef5da28d4bd2e8950f06f77b68\": container with ID starting with 0ed7b8b72baaf75023fc3c065766d3893107afef5da28d4bd2e8950f06f77b68 not found: ID does not exist" containerID="0ed7b8b72baaf75023fc3c065766d3893107afef5da28d4bd2e8950f06f77b68"
Apr 22 18:53:37.699801 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:37.699767 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ed7b8b72baaf75023fc3c065766d3893107afef5da28d4bd2e8950f06f77b68"} err="failed to get container status \"0ed7b8b72baaf75023fc3c065766d3893107afef5da28d4bd2e8950f06f77b68\": rpc error: code = NotFound desc = could not find container \"0ed7b8b72baaf75023fc3c065766d3893107afef5da28d4bd2e8950f06f77b68\": container with ID starting with 0ed7b8b72baaf75023fc3c065766d3893107afef5da28d4bd2e8950f06f77b68 not found: ID does not exist"
Apr 22 18:53:39.549545 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:39.549500 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e902a3ad-a078-499b-8721-22ab824f5588" path="/var/lib/kubelet/pods/e902a3ad-a078-499b-8721-22ab824f5588/volumes"
Apr 22 18:53:45.496337 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:45.496299 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7"]
Apr 22 18:53:45.496728 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:45.496666 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e902a3ad-a078-499b-8721-22ab824f5588" containerName="storage-initializer"
Apr 22 18:53:45.496728 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:45.496679 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e902a3ad-a078-499b-8721-22ab824f5588" containerName="storage-initializer"
Apr 22 18:53:45.496728 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:45.496693 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e902a3ad-a078-499b-8721-22ab824f5588" containerName="main"
Apr 22 18:53:45.496728 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:45.496699 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e902a3ad-a078-499b-8721-22ab824f5588" containerName="main"
Apr 22 18:53:45.496860 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:45.496766 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e902a3ad-a078-499b-8721-22ab824f5588" containerName="main"
Apr 22 18:53:45.499826 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:45.499806 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7"
Apr 22 18:53:45.502204 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:45.502183 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\""
Apr 22 18:53:45.509749 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:45.509720 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7"]
Apr 22 18:53:45.579641 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:45.579601 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g8jl\" (UniqueName: \"kubernetes.io/projected/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e-kube-api-access-8g8jl\") pod \"stop-feature-test-kserve-57f45f89c4-56jg7\" (UID: \"9fc82869-2e1d-4a94-9b25-10ba5b0ba38e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7"
Apr 22 18:53:45.579641 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:45.579644 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e-kserve-provision-location\") pod \"stop-feature-test-kserve-57f45f89c4-56jg7\" (UID: \"9fc82869-2e1d-4a94-9b25-10ba5b0ba38e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7"
Apr 22 18:53:45.579867 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:45.579733 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e-model-cache\") pod \"stop-feature-test-kserve-57f45f89c4-56jg7\" (UID: \"9fc82869-2e1d-4a94-9b25-10ba5b0ba38e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7"
Apr 22 18:53:45.579867 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:45.579772 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e-tls-certs\") pod \"stop-feature-test-kserve-57f45f89c4-56jg7\" (UID: \"9fc82869-2e1d-4a94-9b25-10ba5b0ba38e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7"
Apr 22 18:53:45.579867 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:45.579844 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e-home\") pod \"stop-feature-test-kserve-57f45f89c4-56jg7\" (UID: \"9fc82869-2e1d-4a94-9b25-10ba5b0ba38e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7"
Apr 22 18:53:45.579974 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:45.579894 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e-dshm\") pod \"stop-feature-test-kserve-57f45f89c4-56jg7\" (UID: \"9fc82869-2e1d-4a94-9b25-10ba5b0ba38e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7"
Apr 22 18:53:45.680538 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:45.680494 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e-home\") pod \"stop-feature-test-kserve-57f45f89c4-56jg7\" (UID: \"9fc82869-2e1d-4a94-9b25-10ba5b0ba38e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7"
Apr 22 18:53:45.680748 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:45.680552 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e-dshm\") pod \"stop-feature-test-kserve-57f45f89c4-56jg7\" (UID: \"9fc82869-2e1d-4a94-9b25-10ba5b0ba38e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7"
Apr 22 18:53:45.680748 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:45.680588 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8g8jl\" (UniqueName: \"kubernetes.io/projected/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e-kube-api-access-8g8jl\") pod \"stop-feature-test-kserve-57f45f89c4-56jg7\" (UID: \"9fc82869-2e1d-4a94-9b25-10ba5b0ba38e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7"
Apr 22 18:53:45.680748 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:45.680621 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e-kserve-provision-location\") pod \"stop-feature-test-kserve-57f45f89c4-56jg7\" (UID: \"9fc82869-2e1d-4a94-9b25-10ba5b0ba38e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7"
Apr 22 18:53:45.680748 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:45.680668 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e-model-cache\") pod \"stop-feature-test-kserve-57f45f89c4-56jg7\" (UID: \"9fc82869-2e1d-4a94-9b25-10ba5b0ba38e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7"
Apr 22 18:53:45.680748 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:45.680712 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e-tls-certs\") pod \"stop-feature-test-kserve-57f45f89c4-56jg7\" (UID: \"9fc82869-2e1d-4a94-9b25-10ba5b0ba38e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7"
Apr 22 18:53:45.681069 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:45.681043 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e-home\") pod \"stop-feature-test-kserve-57f45f89c4-56jg7\" (UID: \"9fc82869-2e1d-4a94-9b25-10ba5b0ba38e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7"
Apr 22 18:53:45.681171 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:45.681148 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e-model-cache\") pod \"stop-feature-test-kserve-57f45f89c4-56jg7\" (UID: \"9fc82869-2e1d-4a94-9b25-10ba5b0ba38e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7"
Apr 22 18:53:45.681223 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:45.681160 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e-kserve-provision-location\") pod \"stop-feature-test-kserve-57f45f89c4-56jg7\" (UID: \"9fc82869-2e1d-4a94-9b25-10ba5b0ba38e\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7" Apr 22 18:53:45.682982 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:45.682961 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e-dshm\") pod \"stop-feature-test-kserve-57f45f89c4-56jg7\" (UID: \"9fc82869-2e1d-4a94-9b25-10ba5b0ba38e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7" Apr 22 18:53:45.683129 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:45.683112 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e-tls-certs\") pod \"stop-feature-test-kserve-57f45f89c4-56jg7\" (UID: \"9fc82869-2e1d-4a94-9b25-10ba5b0ba38e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7" Apr 22 18:53:45.690103 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:45.690077 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g8jl\" (UniqueName: \"kubernetes.io/projected/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e-kube-api-access-8g8jl\") pod \"stop-feature-test-kserve-57f45f89c4-56jg7\" (UID: \"9fc82869-2e1d-4a94-9b25-10ba5b0ba38e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7" Apr 22 18:53:45.811567 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:45.811437 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7" Apr 22 18:53:45.946559 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:45.946526 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7"] Apr 22 18:53:45.948366 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:53:45.948338 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fc82869_2e1d_4a94_9b25_10ba5b0ba38e.slice/crio-a6fc8d7ab6e496fb09c11a2419f563a11f075d7337f4354a5f6c0409c13aba09 WatchSource:0}: Error finding container a6fc8d7ab6e496fb09c11a2419f563a11f075d7337f4354a5f6c0409c13aba09: Status 404 returned error can't find the container with id a6fc8d7ab6e496fb09c11a2419f563a11f075d7337f4354a5f6c0409c13aba09 Apr 22 18:53:46.664371 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:46.664330 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7" event={"ID":"9fc82869-2e1d-4a94-9b25-10ba5b0ba38e","Type":"ContainerStarted","Data":"04d2e4ddb5321f4e7ca725a0976679678859090cd21d1e08668505166729811e"} Apr 22 18:53:46.664371 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:46.664369 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7" event={"ID":"9fc82869-2e1d-4a94-9b25-10ba5b0ba38e","Type":"ContainerStarted","Data":"a6fc8d7ab6e496fb09c11a2419f563a11f075d7337f4354a5f6c0409c13aba09"} Apr 22 18:53:47.704192 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:47.704155 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht"] Apr 22 18:53:47.704712 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:47.704506 2578 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" podUID="509e53f5-3242-48ef-b2ae-a3fd2b1bc11a" containerName="main" containerID="cri-o://b7565e8b2bfcbcd604e009bfe25cfa1fecdbd4c5ce300534e230ec8b2069f3bb" gracePeriod=30 Apr 22 18:53:50.681168 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:50.681081 2578 generic.go:358] "Generic (PLEG): container finished" podID="9fc82869-2e1d-4a94-9b25-10ba5b0ba38e" containerID="04d2e4ddb5321f4e7ca725a0976679678859090cd21d1e08668505166729811e" exitCode=0 Apr 22 18:53:50.681641 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:50.681152 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7" event={"ID":"9fc82869-2e1d-4a94-9b25-10ba5b0ba38e","Type":"ContainerDied","Data":"04d2e4ddb5321f4e7ca725a0976679678859090cd21d1e08668505166729811e"} Apr 22 18:53:50.682409 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:50.682393 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:53:51.686157 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:51.686119 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7" event={"ID":"9fc82869-2e1d-4a94-9b25-10ba5b0ba38e","Type":"ContainerStarted","Data":"a8ececf1bfb9bbbe6d2fe03b50d5a16f04c5c8b7897ba29a03b8288691c1c9fe"} Apr 22 18:53:51.708867 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:51.708812 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7" podStartSLOduration=6.7087961499999995 podStartE2EDuration="6.70879615s" podCreationTimestamp="2026-04-22 18:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:53:51.707669748 +0000 UTC m=+960.697936377" watchObservedRunningTime="2026-04-22 
18:53:51.70879615 +0000 UTC m=+960.699062780" Apr 22 18:53:55.812284 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:55.812238 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7" Apr 22 18:53:55.812284 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:55.812286 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7" Apr 22 18:53:55.813856 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:55.813821 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7" podUID="9fc82869-2e1d-4a94-9b25-10ba5b0ba38e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8000/health\": dial tcp 10.134.0.38:8000: connect: connection refused" Apr 22 18:53:58.368442 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:58.368407 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f"] Apr 22 18:53:58.394389 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:58.394351 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f"] Apr 22 18:53:58.394584 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:58.394538 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" Apr 22 18:53:58.397276 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:58.397249 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\"" Apr 22 18:53:58.493729 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:58.493687 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7e612217-0635-4803-82a4-f98382d32a17-home\") pod \"custom-route-timeout-test-kserve-78bf74c48b-2sc5f\" (UID: \"7e612217-0635-4803-82a4-f98382d32a17\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" Apr 22 18:53:58.493918 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:58.493736 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7e612217-0635-4803-82a4-f98382d32a17-model-cache\") pod \"custom-route-timeout-test-kserve-78bf74c48b-2sc5f\" (UID: \"7e612217-0635-4803-82a4-f98382d32a17\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" Apr 22 18:53:58.493918 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:58.493818 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7e612217-0635-4803-82a4-f98382d32a17-dshm\") pod \"custom-route-timeout-test-kserve-78bf74c48b-2sc5f\" (UID: \"7e612217-0635-4803-82a4-f98382d32a17\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" Apr 22 18:53:58.493918 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:58.493853 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7e612217-0635-4803-82a4-f98382d32a17-tls-certs\") pod \"custom-route-timeout-test-kserve-78bf74c48b-2sc5f\" (UID: \"7e612217-0635-4803-82a4-f98382d32a17\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" Apr 22 18:53:58.493918 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:58.493877 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e612217-0635-4803-82a4-f98382d32a17-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-78bf74c48b-2sc5f\" (UID: \"7e612217-0635-4803-82a4-f98382d32a17\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" Apr 22 18:53:58.493918 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:58.493910 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkwhr\" (UniqueName: \"kubernetes.io/projected/7e612217-0635-4803-82a4-f98382d32a17-kube-api-access-jkwhr\") pod \"custom-route-timeout-test-kserve-78bf74c48b-2sc5f\" (UID: \"7e612217-0635-4803-82a4-f98382d32a17\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" Apr 22 18:53:58.594961 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:58.594915 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7e612217-0635-4803-82a4-f98382d32a17-home\") pod \"custom-route-timeout-test-kserve-78bf74c48b-2sc5f\" (UID: \"7e612217-0635-4803-82a4-f98382d32a17\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" Apr 22 18:53:58.595256 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:58.595236 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7e612217-0635-4803-82a4-f98382d32a17-model-cache\") pod 
\"custom-route-timeout-test-kserve-78bf74c48b-2sc5f\" (UID: \"7e612217-0635-4803-82a4-f98382d32a17\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" Apr 22 18:53:58.595413 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:58.595389 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7e612217-0635-4803-82a4-f98382d32a17-dshm\") pod \"custom-route-timeout-test-kserve-78bf74c48b-2sc5f\" (UID: \"7e612217-0635-4803-82a4-f98382d32a17\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" Apr 22 18:53:58.595889 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:58.595869 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7e612217-0635-4803-82a4-f98382d32a17-tls-certs\") pod \"custom-route-timeout-test-kserve-78bf74c48b-2sc5f\" (UID: \"7e612217-0635-4803-82a4-f98382d32a17\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" Apr 22 18:53:58.596453 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:58.596432 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e612217-0635-4803-82a4-f98382d32a17-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-78bf74c48b-2sc5f\" (UID: \"7e612217-0635-4803-82a4-f98382d32a17\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" Apr 22 18:53:58.596651 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:58.596634 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkwhr\" (UniqueName: \"kubernetes.io/projected/7e612217-0635-4803-82a4-f98382d32a17-kube-api-access-jkwhr\") pod \"custom-route-timeout-test-kserve-78bf74c48b-2sc5f\" (UID: \"7e612217-0635-4803-82a4-f98382d32a17\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" Apr 22 18:53:58.597121 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:58.597094 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e612217-0635-4803-82a4-f98382d32a17-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-78bf74c48b-2sc5f\" (UID: \"7e612217-0635-4803-82a4-f98382d32a17\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" Apr 22 18:53:58.597251 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:58.596015 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7e612217-0635-4803-82a4-f98382d32a17-home\") pod \"custom-route-timeout-test-kserve-78bf74c48b-2sc5f\" (UID: \"7e612217-0635-4803-82a4-f98382d32a17\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" Apr 22 18:53:58.597333 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:58.595787 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7e612217-0635-4803-82a4-f98382d32a17-model-cache\") pod \"custom-route-timeout-test-kserve-78bf74c48b-2sc5f\" (UID: \"7e612217-0635-4803-82a4-f98382d32a17\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" Apr 22 18:53:58.604492 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:58.601790 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7e612217-0635-4803-82a4-f98382d32a17-dshm\") pod \"custom-route-timeout-test-kserve-78bf74c48b-2sc5f\" (UID: \"7e612217-0635-4803-82a4-f98382d32a17\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" Apr 22 18:53:58.604492 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:58.602323 2578 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7e612217-0635-4803-82a4-f98382d32a17-tls-certs\") pod \"custom-route-timeout-test-kserve-78bf74c48b-2sc5f\" (UID: \"7e612217-0635-4803-82a4-f98382d32a17\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" Apr 22 18:53:58.613031 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:58.612949 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkwhr\" (UniqueName: \"kubernetes.io/projected/7e612217-0635-4803-82a4-f98382d32a17-kube-api-access-jkwhr\") pod \"custom-route-timeout-test-kserve-78bf74c48b-2sc5f\" (UID: \"7e612217-0635-4803-82a4-f98382d32a17\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" Apr 22 18:53:58.706916 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:58.706880 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" Apr 22 18:53:58.843500 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:58.843456 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f"] Apr 22 18:53:58.846544 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:53:58.846514 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e612217_0635_4803_82a4_f98382d32a17.slice/crio-a50432cb79d8db3a7fcaf320d621a27deab52f0cad5ce3e547755ebb6f9d232a WatchSource:0}: Error finding container a50432cb79d8db3a7fcaf320d621a27deab52f0cad5ce3e547755ebb6f9d232a: Status 404 returned error can't find the container with id a50432cb79d8db3a7fcaf320d621a27deab52f0cad5ce3e547755ebb6f9d232a Apr 22 18:53:59.717116 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:59.717076 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" 
event={"ID":"7e612217-0635-4803-82a4-f98382d32a17","Type":"ContainerStarted","Data":"7cbe07b26274b0ad413dd1e78949f411b3abe2cce38714f0972e709c467b3d2a"} Apr 22 18:53:59.717566 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:53:59.717124 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" event={"ID":"7e612217-0635-4803-82a4-f98382d32a17","Type":"ContainerStarted","Data":"a50432cb79d8db3a7fcaf320d621a27deab52f0cad5ce3e547755ebb6f9d232a"} Apr 22 18:54:03.742366 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:03.742327 2578 generic.go:358] "Generic (PLEG): container finished" podID="7e612217-0635-4803-82a4-f98382d32a17" containerID="7cbe07b26274b0ad413dd1e78949f411b3abe2cce38714f0972e709c467b3d2a" exitCode=0 Apr 22 18:54:03.742899 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:03.742408 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" event={"ID":"7e612217-0635-4803-82a4-f98382d32a17","Type":"ContainerDied","Data":"7cbe07b26274b0ad413dd1e78949f411b3abe2cce38714f0972e709c467b3d2a"} Apr 22 18:54:04.748709 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:04.748659 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" event={"ID":"7e612217-0635-4803-82a4-f98382d32a17","Type":"ContainerStarted","Data":"724c3345e7dc98be3456d19e01e69f7b82760e5383e0504ac81b05e7ba63e5be"} Apr 22 18:54:04.771346 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:04.771290 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" podStartSLOduration=6.771271691 podStartE2EDuration="6.771271691s" podCreationTimestamp="2026-04-22 18:53:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-22 18:54:04.768632493 +0000 UTC m=+973.758899147" watchObservedRunningTime="2026-04-22 18:54:04.771271691 +0000 UTC m=+973.761538320" Apr 22 18:54:05.812158 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:05.812093 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7" podUID="9fc82869-2e1d-4a94-9b25-10ba5b0ba38e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8000/health\": dial tcp 10.134.0.38:8000: connect: connection refused" Apr 22 18:54:08.707408 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:08.707365 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" Apr 22 18:54:08.707919 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:08.707426 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" Apr 22 18:54:08.709054 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:08.709017 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" podUID="7e612217-0635-4803-82a4-f98382d32a17" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 22 18:54:15.812529 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:15.812411 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7" podUID="9fc82869-2e1d-4a94-9b25-10ba5b0ba38e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8000/health\": dial tcp 10.134.0.38:8000: connect: connection refused" Apr 22 18:54:18.013591 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:18.013557 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht_509e53f5-3242-48ef-b2ae-a3fd2b1bc11a/main/0.log" Apr 22 18:54:18.013998 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:18.013989 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" Apr 22 18:54:18.058387 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:18.058335 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a-kserve-provision-location\") pod \"509e53f5-3242-48ef-b2ae-a3fd2b1bc11a\" (UID: \"509e53f5-3242-48ef-b2ae-a3fd2b1bc11a\") " Apr 22 18:54:18.058603 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:18.058413 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a-model-cache\") pod \"509e53f5-3242-48ef-b2ae-a3fd2b1bc11a\" (UID: \"509e53f5-3242-48ef-b2ae-a3fd2b1bc11a\") " Apr 22 18:54:18.058603 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:18.058480 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a-tls-certs\") pod \"509e53f5-3242-48ef-b2ae-a3fd2b1bc11a\" (UID: \"509e53f5-3242-48ef-b2ae-a3fd2b1bc11a\") " Apr 22 18:54:18.058603 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:18.058550 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a-home\") pod \"509e53f5-3242-48ef-b2ae-a3fd2b1bc11a\" (UID: \"509e53f5-3242-48ef-b2ae-a3fd2b1bc11a\") " Apr 22 18:54:18.058603 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:18.058577 2578 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-5q26k\" (UniqueName: \"kubernetes.io/projected/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a-kube-api-access-5q26k\") pod \"509e53f5-3242-48ef-b2ae-a3fd2b1bc11a\" (UID: \"509e53f5-3242-48ef-b2ae-a3fd2b1bc11a\") " Apr 22 18:54:18.058826 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:18.058605 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a-dshm\") pod \"509e53f5-3242-48ef-b2ae-a3fd2b1bc11a\" (UID: \"509e53f5-3242-48ef-b2ae-a3fd2b1bc11a\") " Apr 22 18:54:18.058826 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:18.058773 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a-model-cache" (OuterVolumeSpecName: "model-cache") pod "509e53f5-3242-48ef-b2ae-a3fd2b1bc11a" (UID: "509e53f5-3242-48ef-b2ae-a3fd2b1bc11a"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:54:18.058923 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:18.058886 2578 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a-model-cache\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:54:18.058979 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:18.058956 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a-home" (OuterVolumeSpecName: "home") pod "509e53f5-3242-48ef-b2ae-a3fd2b1bc11a" (UID: "509e53f5-3242-48ef-b2ae-a3fd2b1bc11a"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:54:18.060772 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:18.060745 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a-kube-api-access-5q26k" (OuterVolumeSpecName: "kube-api-access-5q26k") pod "509e53f5-3242-48ef-b2ae-a3fd2b1bc11a" (UID: "509e53f5-3242-48ef-b2ae-a3fd2b1bc11a"). InnerVolumeSpecName "kube-api-access-5q26k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:54:18.061278 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:18.061259 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "509e53f5-3242-48ef-b2ae-a3fd2b1bc11a" (UID: "509e53f5-3242-48ef-b2ae-a3fd2b1bc11a"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:54:18.061359 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:18.061264 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a-dshm" (OuterVolumeSpecName: "dshm") pod "509e53f5-3242-48ef-b2ae-a3fd2b1bc11a" (UID: "509e53f5-3242-48ef-b2ae-a3fd2b1bc11a"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:54:18.125292 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:18.125248 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "509e53f5-3242-48ef-b2ae-a3fd2b1bc11a" (UID: "509e53f5-3242-48ef-b2ae-a3fd2b1bc11a"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:54:18.159832 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:18.159794 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a-tls-certs\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:54:18.159832 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:18.159826 2578 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a-home\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:54:18.159832 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:18.159835 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5q26k\" (UniqueName: \"kubernetes.io/projected/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a-kube-api-access-5q26k\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:54:18.160056 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:18.159845 2578 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a-dshm\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:54:18.160056 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:18.159854 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a-kserve-provision-location\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:54:18.707971 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:18.707930 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" podUID="7e612217-0635-4803-82a4-f98382d32a17" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection 
refused" Apr 22 18:54:18.803175 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:18.803146 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht_509e53f5-3242-48ef-b2ae-a3fd2b1bc11a/main/0.log" Apr 22 18:54:18.803574 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:18.803547 2578 generic.go:358] "Generic (PLEG): container finished" podID="509e53f5-3242-48ef-b2ae-a3fd2b1bc11a" containerID="b7565e8b2bfcbcd604e009bfe25cfa1fecdbd4c5ce300534e230ec8b2069f3bb" exitCode=137 Apr 22 18:54:18.803669 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:18.803648 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" Apr 22 18:54:18.803716 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:18.803658 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" event={"ID":"509e53f5-3242-48ef-b2ae-a3fd2b1bc11a","Type":"ContainerDied","Data":"b7565e8b2bfcbcd604e009bfe25cfa1fecdbd4c5ce300534e230ec8b2069f3bb"} Apr 22 18:54:18.803716 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:18.803709 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht" event={"ID":"509e53f5-3242-48ef-b2ae-a3fd2b1bc11a","Type":"ContainerDied","Data":"d418ebe1d1fe2564845f0071428ff383bfc7bc2bb231fb703ab3ce826f201007"} Apr 22 18:54:18.803787 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:18.803730 2578 scope.go:117] "RemoveContainer" containerID="b7565e8b2bfcbcd604e009bfe25cfa1fecdbd4c5ce300534e230ec8b2069f3bb" Apr 22 18:54:18.825237 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:18.825211 2578 scope.go:117] "RemoveContainer" containerID="ad16f42b481c91f950cc63e14efec4340b127b61dcbc2f0827877282538c0846" Apr 22 18:54:18.831350 
ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:18.831323 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht"] Apr 22 18:54:18.835950 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:18.835926 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-d557d4484-gpvht"] Apr 22 18:54:18.836674 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:18.836654 2578 scope.go:117] "RemoveContainer" containerID="b7565e8b2bfcbcd604e009bfe25cfa1fecdbd4c5ce300534e230ec8b2069f3bb" Apr 22 18:54:18.836972 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:54:18.836954 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7565e8b2bfcbcd604e009bfe25cfa1fecdbd4c5ce300534e230ec8b2069f3bb\": container with ID starting with b7565e8b2bfcbcd604e009bfe25cfa1fecdbd4c5ce300534e230ec8b2069f3bb not found: ID does not exist" containerID="b7565e8b2bfcbcd604e009bfe25cfa1fecdbd4c5ce300534e230ec8b2069f3bb" Apr 22 18:54:18.837065 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:18.836985 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7565e8b2bfcbcd604e009bfe25cfa1fecdbd4c5ce300534e230ec8b2069f3bb"} err="failed to get container status \"b7565e8b2bfcbcd604e009bfe25cfa1fecdbd4c5ce300534e230ec8b2069f3bb\": rpc error: code = NotFound desc = could not find container \"b7565e8b2bfcbcd604e009bfe25cfa1fecdbd4c5ce300534e230ec8b2069f3bb\": container with ID starting with b7565e8b2bfcbcd604e009bfe25cfa1fecdbd4c5ce300534e230ec8b2069f3bb not found: ID does not exist" Apr 22 18:54:18.837065 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:18.837010 2578 scope.go:117] "RemoveContainer" containerID="ad16f42b481c91f950cc63e14efec4340b127b61dcbc2f0827877282538c0846" Apr 22 18:54:18.837275 ip-10-0-132-151 kubenswrapper[2578]: E0422 
18:54:18.837252 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad16f42b481c91f950cc63e14efec4340b127b61dcbc2f0827877282538c0846\": container with ID starting with ad16f42b481c91f950cc63e14efec4340b127b61dcbc2f0827877282538c0846 not found: ID does not exist" containerID="ad16f42b481c91f950cc63e14efec4340b127b61dcbc2f0827877282538c0846" Apr 22 18:54:18.837327 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:18.837286 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad16f42b481c91f950cc63e14efec4340b127b61dcbc2f0827877282538c0846"} err="failed to get container status \"ad16f42b481c91f950cc63e14efec4340b127b61dcbc2f0827877282538c0846\": rpc error: code = NotFound desc = could not find container \"ad16f42b481c91f950cc63e14efec4340b127b61dcbc2f0827877282538c0846\": container with ID starting with ad16f42b481c91f950cc63e14efec4340b127b61dcbc2f0827877282538c0846 not found: ID does not exist" Apr 22 18:54:19.539798 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:19.539763 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="509e53f5-3242-48ef-b2ae-a3fd2b1bc11a" path="/var/lib/kubelet/pods/509e53f5-3242-48ef-b2ae-a3fd2b1bc11a/volumes" Apr 22 18:54:25.811962 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:25.811912 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7" podUID="9fc82869-2e1d-4a94-9b25-10ba5b0ba38e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8000/health\": dial tcp 10.134.0.38:8000: connect: connection refused" Apr 22 18:54:28.707601 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:28.707509 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" podUID="7e612217-0635-4803-82a4-f98382d32a17" containerName="main" probeResult="failure" 
output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 22 18:54:35.812285 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:35.812241 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7" podUID="9fc82869-2e1d-4a94-9b25-10ba5b0ba38e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8000/health\": dial tcp 10.134.0.38:8000: connect: connection refused" Apr 22 18:54:38.707374 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:38.707327 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" podUID="7e612217-0635-4803-82a4-f98382d32a17" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 22 18:54:45.812788 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:45.812743 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7" podUID="9fc82869-2e1d-4a94-9b25-10ba5b0ba38e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8000/health\": dial tcp 10.134.0.38:8000: connect: connection refused" Apr 22 18:54:48.708337 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:48.708290 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" podUID="7e612217-0635-4803-82a4-f98382d32a17" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 22 18:54:55.812735 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:55.812685 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7" podUID="9fc82869-2e1d-4a94-9b25-10ba5b0ba38e" containerName="main" 
probeResult="failure" output="Get \"https://10.134.0.38:8000/health\": dial tcp 10.134.0.38:8000: connect: connection refused" Apr 22 18:54:58.707856 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:54:58.707816 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" podUID="7e612217-0635-4803-82a4-f98382d32a17" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 22 18:55:05.812426 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:55:05.812381 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7" podUID="9fc82869-2e1d-4a94-9b25-10ba5b0ba38e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8000/health\": dial tcp 10.134.0.38:8000: connect: connection refused" Apr 22 18:55:08.708345 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:55:08.708296 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" podUID="7e612217-0635-4803-82a4-f98382d32a17" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 22 18:55:15.812956 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:55:15.812895 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7" podUID="9fc82869-2e1d-4a94-9b25-10ba5b0ba38e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8000/health\": dial tcp 10.134.0.38:8000: connect: connection refused" Apr 22 18:55:18.707637 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:55:18.707594 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" podUID="7e612217-0635-4803-82a4-f98382d32a17" 
containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 22 18:55:25.812266 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:55:25.812218 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7" podUID="9fc82869-2e1d-4a94-9b25-10ba5b0ba38e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8000/health\": dial tcp 10.134.0.38:8000: connect: connection refused" Apr 22 18:55:28.707756 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:55:28.707638 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" podUID="7e612217-0635-4803-82a4-f98382d32a17" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 22 18:55:35.812924 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:55:35.812859 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7" podUID="9fc82869-2e1d-4a94-9b25-10ba5b0ba38e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8000/health\": dial tcp 10.134.0.38:8000: connect: connection refused" Apr 22 18:55:38.708345 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:55:38.708287 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" podUID="7e612217-0635-4803-82a4-f98382d32a17" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 22 18:55:45.812548 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:55:45.812416 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7" 
podUID="9fc82869-2e1d-4a94-9b25-10ba5b0ba38e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8000/health\": dial tcp 10.134.0.38:8000: connect: connection refused" Apr 22 18:55:48.707797 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:55:48.707743 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" podUID="7e612217-0635-4803-82a4-f98382d32a17" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 22 18:55:55.812589 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:55:55.812531 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7" podUID="9fc82869-2e1d-4a94-9b25-10ba5b0ba38e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8000/health\": dial tcp 10.134.0.38:8000: connect: connection refused" Apr 22 18:55:58.707629 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:55:58.707579 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" podUID="7e612217-0635-4803-82a4-f98382d32a17" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 22 18:56:05.822687 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:05.822644 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7" Apr 22 18:56:05.830766 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:05.830734 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7" Apr 22 18:56:06.860872 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:06.860832 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7"] Apr 22 18:56:07.212835 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:07.212765 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7" podUID="9fc82869-2e1d-4a94-9b25-10ba5b0ba38e" containerName="main" containerID="cri-o://a8ececf1bfb9bbbe6d2fe03b50d5a16f04c5c8b7897ba29a03b8288691c1c9fe" gracePeriod=30 Apr 22 18:56:08.707858 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:08.707816 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" podUID="7e612217-0635-4803-82a4-f98382d32a17" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 22 18:56:18.717952 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:18.717921 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" Apr 22 18:56:18.725790 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:18.725769 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" Apr 22 18:56:37.451292 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:37.451268 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-57f45f89c4-56jg7_9fc82869-2e1d-4a94-9b25-10ba5b0ba38e/main/0.log" Apr 22 18:56:37.451661 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:37.451627 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7" Apr 22 18:56:37.576430 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:37.576340 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e-dshm\") pod \"9fc82869-2e1d-4a94-9b25-10ba5b0ba38e\" (UID: \"9fc82869-2e1d-4a94-9b25-10ba5b0ba38e\") " Apr 22 18:56:37.576430 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:37.576389 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e-model-cache\") pod \"9fc82869-2e1d-4a94-9b25-10ba5b0ba38e\" (UID: \"9fc82869-2e1d-4a94-9b25-10ba5b0ba38e\") " Apr 22 18:56:37.576430 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:37.576406 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e-tls-certs\") pod \"9fc82869-2e1d-4a94-9b25-10ba5b0ba38e\" (UID: \"9fc82869-2e1d-4a94-9b25-10ba5b0ba38e\") " Apr 22 18:56:37.576430 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:37.576423 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8g8jl\" (UniqueName: \"kubernetes.io/projected/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e-kube-api-access-8g8jl\") pod \"9fc82869-2e1d-4a94-9b25-10ba5b0ba38e\" (UID: \"9fc82869-2e1d-4a94-9b25-10ba5b0ba38e\") " Apr 22 18:56:37.576805 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:37.576454 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e-home\") pod \"9fc82869-2e1d-4a94-9b25-10ba5b0ba38e\" (UID: \"9fc82869-2e1d-4a94-9b25-10ba5b0ba38e\") " Apr 22 18:56:37.576805 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:37.576538 2578 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e-kserve-provision-location\") pod \"9fc82869-2e1d-4a94-9b25-10ba5b0ba38e\" (UID: \"9fc82869-2e1d-4a94-9b25-10ba5b0ba38e\") " Apr 22 18:56:37.576805 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:37.576666 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e-model-cache" (OuterVolumeSpecName: "model-cache") pod "9fc82869-2e1d-4a94-9b25-10ba5b0ba38e" (UID: "9fc82869-2e1d-4a94-9b25-10ba5b0ba38e"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:56:37.576951 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:37.576913 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e-home" (OuterVolumeSpecName: "home") pod "9fc82869-2e1d-4a94-9b25-10ba5b0ba38e" (UID: "9fc82869-2e1d-4a94-9b25-10ba5b0ba38e"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:56:37.578726 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:37.578682 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e-dshm" (OuterVolumeSpecName: "dshm") pod "9fc82869-2e1d-4a94-9b25-10ba5b0ba38e" (UID: "9fc82869-2e1d-4a94-9b25-10ba5b0ba38e"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:56:37.578931 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:37.578769 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "9fc82869-2e1d-4a94-9b25-10ba5b0ba38e" (UID: "9fc82869-2e1d-4a94-9b25-10ba5b0ba38e"). 
InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:56:37.578931 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:37.578784 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e-kube-api-access-8g8jl" (OuterVolumeSpecName: "kube-api-access-8g8jl") pod "9fc82869-2e1d-4a94-9b25-10ba5b0ba38e" (UID: "9fc82869-2e1d-4a94-9b25-10ba5b0ba38e"). InnerVolumeSpecName "kube-api-access-8g8jl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:56:37.638003 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:37.637951 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9fc82869-2e1d-4a94-9b25-10ba5b0ba38e" (UID: "9fc82869-2e1d-4a94-9b25-10ba5b0ba38e"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:56:37.677826 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:37.677782 2578 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e-dshm\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:56:37.677826 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:37.677824 2578 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e-model-cache\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:56:37.678049 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:37.677838 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e-tls-certs\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:56:37.678049 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:37.677851 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8g8jl\" (UniqueName: \"kubernetes.io/projected/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e-kube-api-access-8g8jl\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:56:37.678049 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:37.677864 2578 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e-home\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:56:37.678049 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:37.677873 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e-kserve-provision-location\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:56:38.316088 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:38.316058 2578 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-57f45f89c4-56jg7_9fc82869-2e1d-4a94-9b25-10ba5b0ba38e/main/0.log" Apr 22 18:56:38.316435 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:38.316410 2578 generic.go:358] "Generic (PLEG): container finished" podID="9fc82869-2e1d-4a94-9b25-10ba5b0ba38e" containerID="a8ececf1bfb9bbbe6d2fe03b50d5a16f04c5c8b7897ba29a03b8288691c1c9fe" exitCode=137 Apr 22 18:56:38.316540 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:38.316504 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7" Apr 22 18:56:38.316599 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:38.316494 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7" event={"ID":"9fc82869-2e1d-4a94-9b25-10ba5b0ba38e","Type":"ContainerDied","Data":"a8ececf1bfb9bbbe6d2fe03b50d5a16f04c5c8b7897ba29a03b8288691c1c9fe"} Apr 22 18:56:38.316636 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:38.316623 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7" event={"ID":"9fc82869-2e1d-4a94-9b25-10ba5b0ba38e","Type":"ContainerDied","Data":"a6fc8d7ab6e496fb09c11a2419f563a11f075d7337f4354a5f6c0409c13aba09"} Apr 22 18:56:38.316669 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:38.316645 2578 scope.go:117] "RemoveContainer" containerID="a8ececf1bfb9bbbe6d2fe03b50d5a16f04c5c8b7897ba29a03b8288691c1c9fe" Apr 22 18:56:38.336404 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:38.336355 2578 scope.go:117] "RemoveContainer" containerID="04d2e4ddb5321f4e7ca725a0976679678859090cd21d1e08668505166729811e" Apr 22 18:56:38.339538 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:38.339514 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7"] Apr 22 
18:56:38.343712 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:38.343683 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-56jg7"] Apr 22 18:56:38.399077 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:38.399049 2578 scope.go:117] "RemoveContainer" containerID="a8ececf1bfb9bbbe6d2fe03b50d5a16f04c5c8b7897ba29a03b8288691c1c9fe" Apr 22 18:56:38.399409 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:56:38.399383 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8ececf1bfb9bbbe6d2fe03b50d5a16f04c5c8b7897ba29a03b8288691c1c9fe\": container with ID starting with a8ececf1bfb9bbbe6d2fe03b50d5a16f04c5c8b7897ba29a03b8288691c1c9fe not found: ID does not exist" containerID="a8ececf1bfb9bbbe6d2fe03b50d5a16f04c5c8b7897ba29a03b8288691c1c9fe" Apr 22 18:56:38.399607 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:38.399417 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8ececf1bfb9bbbe6d2fe03b50d5a16f04c5c8b7897ba29a03b8288691c1c9fe"} err="failed to get container status \"a8ececf1bfb9bbbe6d2fe03b50d5a16f04c5c8b7897ba29a03b8288691c1c9fe\": rpc error: code = NotFound desc = could not find container \"a8ececf1bfb9bbbe6d2fe03b50d5a16f04c5c8b7897ba29a03b8288691c1c9fe\": container with ID starting with a8ececf1bfb9bbbe6d2fe03b50d5a16f04c5c8b7897ba29a03b8288691c1c9fe not found: ID does not exist" Apr 22 18:56:38.399607 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:38.399440 2578 scope.go:117] "RemoveContainer" containerID="04d2e4ddb5321f4e7ca725a0976679678859090cd21d1e08668505166729811e" Apr 22 18:56:38.399745 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:56:38.399721 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04d2e4ddb5321f4e7ca725a0976679678859090cd21d1e08668505166729811e\": container with ID starting with 
04d2e4ddb5321f4e7ca725a0976679678859090cd21d1e08668505166729811e not found: ID does not exist" containerID="04d2e4ddb5321f4e7ca725a0976679678859090cd21d1e08668505166729811e" Apr 22 18:56:38.399783 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:38.399758 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04d2e4ddb5321f4e7ca725a0976679678859090cd21d1e08668505166729811e"} err="failed to get container status \"04d2e4ddb5321f4e7ca725a0976679678859090cd21d1e08668505166729811e\": rpc error: code = NotFound desc = could not find container \"04d2e4ddb5321f4e7ca725a0976679678859090cd21d1e08668505166729811e\": container with ID starting with 04d2e4ddb5321f4e7ca725a0976679678859090cd21d1e08668505166729811e not found: ID does not exist" Apr 22 18:56:39.539549 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:56:39.539514 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fc82869-2e1d-4a94-9b25-10ba5b0ba38e" path="/var/lib/kubelet/pods/9fc82869-2e1d-4a94-9b25-10ba5b0ba38e/volumes" Apr 22 18:57:05.297835 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:05.297800 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6"] Apr 22 18:57:05.298296 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:05.298185 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="509e53f5-3242-48ef-b2ae-a3fd2b1bc11a" containerName="main" Apr 22 18:57:05.298296 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:05.298201 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="509e53f5-3242-48ef-b2ae-a3fd2b1bc11a" containerName="main" Apr 22 18:57:05.298296 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:05.298210 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9fc82869-2e1d-4a94-9b25-10ba5b0ba38e" containerName="storage-initializer" Apr 22 18:57:05.298296 ip-10-0-132-151 kubenswrapper[2578]: I0422 
18:57:05.298215 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fc82869-2e1d-4a94-9b25-10ba5b0ba38e" containerName="storage-initializer"
Apr 22 18:57:05.298296 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:05.298227 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="509e53f5-3242-48ef-b2ae-a3fd2b1bc11a" containerName="storage-initializer"
Apr 22 18:57:05.298296 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:05.298234 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="509e53f5-3242-48ef-b2ae-a3fd2b1bc11a" containerName="storage-initializer"
Apr 22 18:57:05.298296 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:05.298241 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9fc82869-2e1d-4a94-9b25-10ba5b0ba38e" containerName="main"
Apr 22 18:57:05.298296 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:05.298246 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fc82869-2e1d-4a94-9b25-10ba5b0ba38e" containerName="main"
Apr 22 18:57:05.298296 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:05.298297 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="9fc82869-2e1d-4a94-9b25-10ba5b0ba38e" containerName="main"
Apr 22 18:57:05.298602 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:05.298306 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="509e53f5-3242-48ef-b2ae-a3fd2b1bc11a" containerName="main"
Apr 22 18:57:05.302374 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:05.302354 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6"
Apr 22 18:57:05.304867 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:05.304839 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\""
Apr 22 18:57:05.312373 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:05.312340 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6"]
Apr 22 18:57:05.312870 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:05.312841 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2694409c-5883-4d28-a018-05aa1b9da10c-tls-certs\") pod \"stop-feature-test-kserve-57f45f89c4-zbqw6\" (UID: \"2694409c-5883-4d28-a018-05aa1b9da10c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6"
Apr 22 18:57:05.313005 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:05.312888 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2694409c-5883-4d28-a018-05aa1b9da10c-kserve-provision-location\") pod \"stop-feature-test-kserve-57f45f89c4-zbqw6\" (UID: \"2694409c-5883-4d28-a018-05aa1b9da10c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6"
Apr 22 18:57:05.313005 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:05.312924 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2694409c-5883-4d28-a018-05aa1b9da10c-dshm\") pod \"stop-feature-test-kserve-57f45f89c4-zbqw6\" (UID: \"2694409c-5883-4d28-a018-05aa1b9da10c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6"
Apr 22 18:57:05.313103 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:05.312994 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2694409c-5883-4d28-a018-05aa1b9da10c-home\") pod \"stop-feature-test-kserve-57f45f89c4-zbqw6\" (UID: \"2694409c-5883-4d28-a018-05aa1b9da10c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6"
Apr 22 18:57:05.313103 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:05.313074 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2694409c-5883-4d28-a018-05aa1b9da10c-model-cache\") pod \"stop-feature-test-kserve-57f45f89c4-zbqw6\" (UID: \"2694409c-5883-4d28-a018-05aa1b9da10c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6"
Apr 22 18:57:05.313193 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:05.313114 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwsvv\" (UniqueName: \"kubernetes.io/projected/2694409c-5883-4d28-a018-05aa1b9da10c-kube-api-access-kwsvv\") pod \"stop-feature-test-kserve-57f45f89c4-zbqw6\" (UID: \"2694409c-5883-4d28-a018-05aa1b9da10c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6"
Apr 22 18:57:05.414180 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:05.414144 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2694409c-5883-4d28-a018-05aa1b9da10c-tls-certs\") pod \"stop-feature-test-kserve-57f45f89c4-zbqw6\" (UID: \"2694409c-5883-4d28-a018-05aa1b9da10c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6"
Apr 22 18:57:05.414383 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:05.414185 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2694409c-5883-4d28-a018-05aa1b9da10c-kserve-provision-location\") pod \"stop-feature-test-kserve-57f45f89c4-zbqw6\" (UID: \"2694409c-5883-4d28-a018-05aa1b9da10c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6"
Apr 22 18:57:05.414383 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:05.414220 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2694409c-5883-4d28-a018-05aa1b9da10c-dshm\") pod \"stop-feature-test-kserve-57f45f89c4-zbqw6\" (UID: \"2694409c-5883-4d28-a018-05aa1b9da10c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6"
Apr 22 18:57:05.414383 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:05.414255 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2694409c-5883-4d28-a018-05aa1b9da10c-home\") pod \"stop-feature-test-kserve-57f45f89c4-zbqw6\" (UID: \"2694409c-5883-4d28-a018-05aa1b9da10c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6"
Apr 22 18:57:05.414383 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:05.414327 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2694409c-5883-4d28-a018-05aa1b9da10c-model-cache\") pod \"stop-feature-test-kserve-57f45f89c4-zbqw6\" (UID: \"2694409c-5883-4d28-a018-05aa1b9da10c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6"
Apr 22 18:57:05.414383 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:05.414363 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kwsvv\" (UniqueName: \"kubernetes.io/projected/2694409c-5883-4d28-a018-05aa1b9da10c-kube-api-access-kwsvv\") pod \"stop-feature-test-kserve-57f45f89c4-zbqw6\" (UID: \"2694409c-5883-4d28-a018-05aa1b9da10c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6"
Apr 22 18:57:05.414729 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:05.414702 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2694409c-5883-4d28-a018-05aa1b9da10c-kserve-provision-location\") pod \"stop-feature-test-kserve-57f45f89c4-zbqw6\" (UID: \"2694409c-5883-4d28-a018-05aa1b9da10c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6"
Apr 22 18:57:05.414860 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:05.414744 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2694409c-5883-4d28-a018-05aa1b9da10c-home\") pod \"stop-feature-test-kserve-57f45f89c4-zbqw6\" (UID: \"2694409c-5883-4d28-a018-05aa1b9da10c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6"
Apr 22 18:57:05.414923 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:05.414892 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2694409c-5883-4d28-a018-05aa1b9da10c-model-cache\") pod \"stop-feature-test-kserve-57f45f89c4-zbqw6\" (UID: \"2694409c-5883-4d28-a018-05aa1b9da10c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6"
Apr 22 18:57:05.416746 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:05.416725 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2694409c-5883-4d28-a018-05aa1b9da10c-dshm\") pod \"stop-feature-test-kserve-57f45f89c4-zbqw6\" (UID: \"2694409c-5883-4d28-a018-05aa1b9da10c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6"
Apr 22 18:57:05.417133 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:05.417111 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2694409c-5883-4d28-a018-05aa1b9da10c-tls-certs\") pod \"stop-feature-test-kserve-57f45f89c4-zbqw6\" (UID: \"2694409c-5883-4d28-a018-05aa1b9da10c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6"
Apr 22 18:57:05.423551 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:05.423435 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwsvv\" (UniqueName: \"kubernetes.io/projected/2694409c-5883-4d28-a018-05aa1b9da10c-kube-api-access-kwsvv\") pod \"stop-feature-test-kserve-57f45f89c4-zbqw6\" (UID: \"2694409c-5883-4d28-a018-05aa1b9da10c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6"
Apr 22 18:57:05.614424 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:05.614336 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6"
Apr 22 18:57:05.750714 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:05.750686 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6"]
Apr 22 18:57:05.753091 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:57:05.753060 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2694409c_5883_4d28_a018_05aa1b9da10c.slice/crio-5d5d006b5cb834f933c217ccb068d54993a0a69a9a6fc458090267991eee3efa WatchSource:0}: Error finding container 5d5d006b5cb834f933c217ccb068d54993a0a69a9a6fc458090267991eee3efa: Status 404 returned error can't find the container with id 5d5d006b5cb834f933c217ccb068d54993a0a69a9a6fc458090267991eee3efa
Apr 22 18:57:06.418364 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:06.418321 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6" event={"ID":"2694409c-5883-4d28-a018-05aa1b9da10c","Type":"ContainerStarted","Data":"895f3810ee809fa2302f4f662b08d8cf705db9776d4addcc07b161c5124c7e7c"}
Apr 22 18:57:06.418866 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:06.418372 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6" event={"ID":"2694409c-5883-4d28-a018-05aa1b9da10c","Type":"ContainerStarted","Data":"5d5d006b5cb834f933c217ccb068d54993a0a69a9a6fc458090267991eee3efa"}
Apr 22 18:57:07.326831 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:07.326789 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f"]
Apr 22 18:57:07.327183 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:07.327128 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" podUID="7e612217-0635-4803-82a4-f98382d32a17" containerName="main" containerID="cri-o://724c3345e7dc98be3456d19e01e69f7b82760e5383e0504ac81b05e7ba63e5be" gracePeriod=30
Apr 22 18:57:10.433449 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:10.433411 2578 generic.go:358] "Generic (PLEG): container finished" podID="2694409c-5883-4d28-a018-05aa1b9da10c" containerID="895f3810ee809fa2302f4f662b08d8cf705db9776d4addcc07b161c5124c7e7c" exitCode=0
Apr 22 18:57:10.433912 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:10.433502 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6" event={"ID":"2694409c-5883-4d28-a018-05aa1b9da10c","Type":"ContainerDied","Data":"895f3810ee809fa2302f4f662b08d8cf705db9776d4addcc07b161c5124c7e7c"}
Apr 22 18:57:11.438952 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:11.438916 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6" event={"ID":"2694409c-5883-4d28-a018-05aa1b9da10c","Type":"ContainerStarted","Data":"9c9151104bcc831f126454bb822bf6cf5ba08c22b6c8505e4b2da18cfc43b9e1"}
Apr 22 18:57:11.460197 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:11.460148 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6" podStartSLOduration=6.460133177 podStartE2EDuration="6.460133177s" podCreationTimestamp="2026-04-22 18:57:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:57:11.459065635 +0000 UTC m=+1160.449332264" watchObservedRunningTime="2026-04-22 18:57:11.460133177 +0000 UTC m=+1160.450399805"
Apr 22 18:57:15.615250 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:15.615211 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6"
Apr 22 18:57:15.615250 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:15.615256 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6"
Apr 22 18:57:15.616940 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:15.616904 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6" podUID="2694409c-5883-4d28-a018-05aa1b9da10c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused"
Apr 22 18:57:23.079090 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:23.079055 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb"]
Apr 22 18:57:23.111830 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:23.111794 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb"]
Apr 22 18:57:23.111988 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:23.111928 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb"
Apr 22 18:57:23.114597 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:23.114571 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\""
Apr 22 18:57:23.167041 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:23.166985 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfj45\" (UniqueName: \"kubernetes.io/projected/8db883c7-aba8-4294-8579-23fe05229f53-kube-api-access-pfj45\") pod \"router-with-refs-test-kserve-75b8c8cc4b-6d9gb\" (UID: \"8db883c7-aba8-4294-8579-23fe05229f53\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb"
Apr 22 18:57:23.167041 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:23.167045 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8db883c7-aba8-4294-8579-23fe05229f53-home\") pod \"router-with-refs-test-kserve-75b8c8cc4b-6d9gb\" (UID: \"8db883c7-aba8-4294-8579-23fe05229f53\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb"
Apr 22 18:57:23.167258 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:23.167175 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8db883c7-aba8-4294-8579-23fe05229f53-tls-certs\") pod \"router-with-refs-test-kserve-75b8c8cc4b-6d9gb\" (UID: \"8db883c7-aba8-4294-8579-23fe05229f53\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb"
Apr 22 18:57:23.167258 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:23.167211 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8db883c7-aba8-4294-8579-23fe05229f53-model-cache\") pod \"router-with-refs-test-kserve-75b8c8cc4b-6d9gb\" (UID: \"8db883c7-aba8-4294-8579-23fe05229f53\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb"
Apr 22 18:57:23.167344 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:23.167265 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8db883c7-aba8-4294-8579-23fe05229f53-dshm\") pod \"router-with-refs-test-kserve-75b8c8cc4b-6d9gb\" (UID: \"8db883c7-aba8-4294-8579-23fe05229f53\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb"
Apr 22 18:57:23.167344 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:23.167284 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8db883c7-aba8-4294-8579-23fe05229f53-kserve-provision-location\") pod \"router-with-refs-test-kserve-75b8c8cc4b-6d9gb\" (UID: \"8db883c7-aba8-4294-8579-23fe05229f53\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb"
Apr 22 18:57:23.267872 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:23.267828 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8db883c7-aba8-4294-8579-23fe05229f53-tls-certs\") pod \"router-with-refs-test-kserve-75b8c8cc4b-6d9gb\" (UID: \"8db883c7-aba8-4294-8579-23fe05229f53\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb"
Apr 22 18:57:23.267872 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:23.267876 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8db883c7-aba8-4294-8579-23fe05229f53-model-cache\") pod \"router-with-refs-test-kserve-75b8c8cc4b-6d9gb\" (UID: \"8db883c7-aba8-4294-8579-23fe05229f53\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb"
Apr 22 18:57:23.268119 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:23.267917 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8db883c7-aba8-4294-8579-23fe05229f53-dshm\") pod \"router-with-refs-test-kserve-75b8c8cc4b-6d9gb\" (UID: \"8db883c7-aba8-4294-8579-23fe05229f53\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb"
Apr 22 18:57:23.268119 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:23.267941 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8db883c7-aba8-4294-8579-23fe05229f53-kserve-provision-location\") pod \"router-with-refs-test-kserve-75b8c8cc4b-6d9gb\" (UID: \"8db883c7-aba8-4294-8579-23fe05229f53\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb"
Apr 22 18:57:23.268119 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:23.267977 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pfj45\" (UniqueName: \"kubernetes.io/projected/8db883c7-aba8-4294-8579-23fe05229f53-kube-api-access-pfj45\") pod \"router-with-refs-test-kserve-75b8c8cc4b-6d9gb\" (UID: \"8db883c7-aba8-4294-8579-23fe05229f53\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb"
Apr 22 18:57:23.268119 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:23.268010 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8db883c7-aba8-4294-8579-23fe05229f53-home\") pod \"router-with-refs-test-kserve-75b8c8cc4b-6d9gb\" (UID: \"8db883c7-aba8-4294-8579-23fe05229f53\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb"
Apr 22 18:57:23.268476 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:23.268434 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8db883c7-aba8-4294-8579-23fe05229f53-home\") pod \"router-with-refs-test-kserve-75b8c8cc4b-6d9gb\" (UID: \"8db883c7-aba8-4294-8579-23fe05229f53\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb"
Apr 22 18:57:23.268570 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:23.268554 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8db883c7-aba8-4294-8579-23fe05229f53-model-cache\") pod \"router-with-refs-test-kserve-75b8c8cc4b-6d9gb\" (UID: \"8db883c7-aba8-4294-8579-23fe05229f53\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb"
Apr 22 18:57:23.268570 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:23.268556 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8db883c7-aba8-4294-8579-23fe05229f53-kserve-provision-location\") pod \"router-with-refs-test-kserve-75b8c8cc4b-6d9gb\" (UID: \"8db883c7-aba8-4294-8579-23fe05229f53\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb"
Apr 22 18:57:23.270362 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:23.270335 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8db883c7-aba8-4294-8579-23fe05229f53-dshm\") pod \"router-with-refs-test-kserve-75b8c8cc4b-6d9gb\" (UID: \"8db883c7-aba8-4294-8579-23fe05229f53\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb"
Apr 22 18:57:23.270570 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:23.270551 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8db883c7-aba8-4294-8579-23fe05229f53-tls-certs\") pod \"router-with-refs-test-kserve-75b8c8cc4b-6d9gb\" (UID: \"8db883c7-aba8-4294-8579-23fe05229f53\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb"
Apr 22 18:57:23.276780 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:23.276753 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfj45\" (UniqueName: \"kubernetes.io/projected/8db883c7-aba8-4294-8579-23fe05229f53-kube-api-access-pfj45\") pod \"router-with-refs-test-kserve-75b8c8cc4b-6d9gb\" (UID: \"8db883c7-aba8-4294-8579-23fe05229f53\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb"
Apr 22 18:57:23.422958 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:23.422912 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb"
Apr 22 18:57:23.560677 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:23.560550 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb"]
Apr 22 18:57:23.563515 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:57:23.563456 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8db883c7_aba8_4294_8579_23fe05229f53.slice/crio-83fb6cfbc9d1c9a6481c0ddaf469c0d45cfa8f44fb0e8f0d0743e990774fdba4 WatchSource:0}: Error finding container 83fb6cfbc9d1c9a6481c0ddaf469c0d45cfa8f44fb0e8f0d0743e990774fdba4: Status 404 returned error can't find the container with id 83fb6cfbc9d1c9a6481c0ddaf469c0d45cfa8f44fb0e8f0d0743e990774fdba4
Apr 22 18:57:24.484418 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:24.484380 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb" event={"ID":"8db883c7-aba8-4294-8579-23fe05229f53","Type":"ContainerStarted","Data":"3facddb59bdfb8f391dedc9aa5737760e2a33490161f29c573f0d774e309c188"}
Apr 22 18:57:24.484418 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:24.484421 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb" event={"ID":"8db883c7-aba8-4294-8579-23fe05229f53","Type":"ContainerStarted","Data":"83fb6cfbc9d1c9a6481c0ddaf469c0d45cfa8f44fb0e8f0d0743e990774fdba4"}
Apr 22 18:57:25.615763 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:25.615714 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6" podUID="2694409c-5883-4d28-a018-05aa1b9da10c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused"
Apr 22 18:57:28.499802 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:28.499761 2578 generic.go:358] "Generic (PLEG): container finished" podID="8db883c7-aba8-4294-8579-23fe05229f53" containerID="3facddb59bdfb8f391dedc9aa5737760e2a33490161f29c573f0d774e309c188" exitCode=0
Apr 22 18:57:28.500203 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:28.499811 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb" event={"ID":"8db883c7-aba8-4294-8579-23fe05229f53","Type":"ContainerDied","Data":"3facddb59bdfb8f391dedc9aa5737760e2a33490161f29c573f0d774e309c188"}
Apr 22 18:57:29.505786 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:29.505748 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb" event={"ID":"8db883c7-aba8-4294-8579-23fe05229f53","Type":"ContainerStarted","Data":"fc437eb72b3b732b51802a353da8256e41c225e200beeb6ba5329427a9d154a4"}
Apr 22 18:57:33.423891 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:33.423843 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb"
Apr 22 18:57:33.424375 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:33.423942 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb"
Apr 22 18:57:33.425801 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:33.425761 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb" podUID="8db883c7-aba8-4294-8579-23fe05229f53" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused"
Apr 22 18:57:35.615507 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:35.615444 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6" podUID="2694409c-5883-4d28-a018-05aa1b9da10c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused"
Apr 22 18:57:37.538510 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:37.538453 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-78bf74c48b-2sc5f_7e612217-0635-4803-82a4-f98382d32a17/main/0.log"
Apr 22 18:57:37.538956 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:37.538907 2578 generic.go:358] "Generic (PLEG): container finished" podID="7e612217-0635-4803-82a4-f98382d32a17" containerID="724c3345e7dc98be3456d19e01e69f7b82760e5383e0504ac81b05e7ba63e5be" exitCode=137
Apr 22 18:57:37.540773 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:37.540740 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" event={"ID":"7e612217-0635-4803-82a4-f98382d32a17","Type":"ContainerDied","Data":"724c3345e7dc98be3456d19e01e69f7b82760e5383e0504ac81b05e7ba63e5be"}
Apr 22 18:57:37.625855 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:37.625827 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-78bf74c48b-2sc5f_7e612217-0635-4803-82a4-f98382d32a17/main/0.log"
Apr 22 18:57:37.626283 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:37.626267 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f"
Apr 22 18:57:37.650528 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:37.650439 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb" podStartSLOduration=14.650422354 podStartE2EDuration="14.650422354s" podCreationTimestamp="2026-04-22 18:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:57:29.532176087 +0000 UTC m=+1178.522442717" watchObservedRunningTime="2026-04-22 18:57:37.650422354 +0000 UTC m=+1186.640688983"
Apr 22 18:57:37.708354 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:37.708315 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7e612217-0635-4803-82a4-f98382d32a17-tls-certs\") pod \"7e612217-0635-4803-82a4-f98382d32a17\" (UID: \"7e612217-0635-4803-82a4-f98382d32a17\") "
Apr 22 18:57:37.708619 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:37.708372 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e612217-0635-4803-82a4-f98382d32a17-kserve-provision-location\") pod \"7e612217-0635-4803-82a4-f98382d32a17\" (UID: \"7e612217-0635-4803-82a4-f98382d32a17\") "
Apr 22 18:57:37.708619 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:37.708417 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7e612217-0635-4803-82a4-f98382d32a17-dshm\") pod \"7e612217-0635-4803-82a4-f98382d32a17\" (UID: \"7e612217-0635-4803-82a4-f98382d32a17\") "
Apr 22 18:57:37.708619 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:37.708437 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwhr\" (UniqueName: \"kubernetes.io/projected/7e612217-0635-4803-82a4-f98382d32a17-kube-api-access-jkwhr\") pod \"7e612217-0635-4803-82a4-f98382d32a17\" (UID: \"7e612217-0635-4803-82a4-f98382d32a17\") "
Apr 22 18:57:37.708619 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:37.708476 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7e612217-0635-4803-82a4-f98382d32a17-model-cache\") pod \"7e612217-0635-4803-82a4-f98382d32a17\" (UID: \"7e612217-0635-4803-82a4-f98382d32a17\") "
Apr 22 18:57:37.708619 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:37.708516 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7e612217-0635-4803-82a4-f98382d32a17-home\") pod \"7e612217-0635-4803-82a4-f98382d32a17\" (UID: \"7e612217-0635-4803-82a4-f98382d32a17\") "
Apr 22 18:57:37.708899 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:37.708792 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e612217-0635-4803-82a4-f98382d32a17-model-cache" (OuterVolumeSpecName: "model-cache") pod "7e612217-0635-4803-82a4-f98382d32a17" (UID: "7e612217-0635-4803-82a4-f98382d32a17"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:57:37.709025 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:37.708999 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e612217-0635-4803-82a4-f98382d32a17-home" (OuterVolumeSpecName: "home") pod "7e612217-0635-4803-82a4-f98382d32a17" (UID: "7e612217-0635-4803-82a4-f98382d32a17"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:57:37.710893 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:37.710863 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e612217-0635-4803-82a4-f98382d32a17-kube-api-access-jkwhr" (OuterVolumeSpecName: "kube-api-access-jkwhr") pod "7e612217-0635-4803-82a4-f98382d32a17" (UID: "7e612217-0635-4803-82a4-f98382d32a17"). InnerVolumeSpecName "kube-api-access-jkwhr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:57:37.711261 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:37.711234 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e612217-0635-4803-82a4-f98382d32a17-dshm" (OuterVolumeSpecName: "dshm") pod "7e612217-0635-4803-82a4-f98382d32a17" (UID: "7e612217-0635-4803-82a4-f98382d32a17"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:57:37.711578 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:37.711555 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e612217-0635-4803-82a4-f98382d32a17-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "7e612217-0635-4803-82a4-f98382d32a17" (UID: "7e612217-0635-4803-82a4-f98382d32a17"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:57:37.771666 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:37.771615 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e612217-0635-4803-82a4-f98382d32a17-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7e612217-0635-4803-82a4-f98382d32a17" (UID: "7e612217-0635-4803-82a4-f98382d32a17"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:57:37.809342 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:37.809294 2578 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7e612217-0635-4803-82a4-f98382d32a17-home\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:57:37.809342 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:37.809343 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7e612217-0635-4803-82a4-f98382d32a17-tls-certs\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:57:37.809538 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:37.809362 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e612217-0635-4803-82a4-f98382d32a17-kserve-provision-location\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:57:37.809538 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:37.809377 2578 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7e612217-0635-4803-82a4-f98382d32a17-dshm\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:57:37.809538 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:37.809394 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jkwhr\" (UniqueName: \"kubernetes.io/projected/7e612217-0635-4803-82a4-f98382d32a17-kube-api-access-jkwhr\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:57:37.809538 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:37.809407 2578 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7e612217-0635-4803-82a4-f98382d32a17-model-cache\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 18:57:38.545376 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:38.545347 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-78bf74c48b-2sc5f_7e612217-0635-4803-82a4-f98382d32a17/main/0.log"
Apr 22 18:57:38.545821 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:38.545793 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f" event={"ID":"7e612217-0635-4803-82a4-f98382d32a17","Type":"ContainerDied","Data":"a50432cb79d8db3a7fcaf320d621a27deab52f0cad5ce3e547755ebb6f9d232a"}
Apr 22 18:57:38.545898 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:38.545823 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f"
Apr 22 18:57:38.545898 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:38.545843 2578 scope.go:117] "RemoveContainer" containerID="724c3345e7dc98be3456d19e01e69f7b82760e5383e0504ac81b05e7ba63e5be"
Apr 22 18:57:38.569748 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:38.569717 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f"]
Apr 22 18:57:38.576509 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:38.576477 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-78bf74c48b-2sc5f"]
Apr 22 18:57:38.577161 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:38.577141 2578 scope.go:117] "RemoveContainer" containerID="7cbe07b26274b0ad413dd1e78949f411b3abe2cce38714f0972e709c467b3d2a"
Apr 22 18:57:39.539233 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:39.539196 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e612217-0635-4803-82a4-f98382d32a17" path="/var/lib/kubelet/pods/7e612217-0635-4803-82a4-f98382d32a17/volumes"
Apr 22 18:57:43.423535 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:43.423478 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb" podUID="8db883c7-aba8-4294-8579-23fe05229f53" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused"
Apr 22 18:57:45.615706 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:45.615657 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6" podUID="2694409c-5883-4d28-a018-05aa1b9da10c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused"
Apr 22
18:57:51.503535 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:51.503501 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2rx9_14a4227e-0dba-42ae-a250-c23eb012b836/ovn-acl-logging/0.log" Apr 22 18:57:51.508285 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:51.508263 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2rx9_14a4227e-0dba-42ae-a250-c23eb012b836/ovn-acl-logging/0.log" Apr 22 18:57:53.423692 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:53.423642 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb" podUID="8db883c7-aba8-4294-8579-23fe05229f53" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 22 18:57:55.615277 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:57:55.615222 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6" podUID="2694409c-5883-4d28-a018-05aa1b9da10c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 22 18:58:03.423424 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:58:03.423364 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb" podUID="8db883c7-aba8-4294-8579-23fe05229f53" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 22 18:58:05.615200 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:58:05.615157 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6" podUID="2694409c-5883-4d28-a018-05aa1b9da10c" containerName="main" probeResult="failure" output="Get 
\"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 22 18:58:13.423739 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:58:13.423686 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb" podUID="8db883c7-aba8-4294-8579-23fe05229f53" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 22 18:58:15.615483 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:58:15.615421 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6" podUID="2694409c-5883-4d28-a018-05aa1b9da10c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 22 18:58:23.423674 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:58:23.423629 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb" podUID="8db883c7-aba8-4294-8579-23fe05229f53" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 22 18:58:25.615323 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:58:25.615276 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6" podUID="2694409c-5883-4d28-a018-05aa1b9da10c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 22 18:58:33.424386 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:58:33.424329 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb" podUID="8db883c7-aba8-4294-8579-23fe05229f53" containerName="main" probeResult="failure" 
output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 22 18:58:35.615074 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:58:35.615022 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6" podUID="2694409c-5883-4d28-a018-05aa1b9da10c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 22 18:58:43.423751 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:58:43.423647 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb" podUID="8db883c7-aba8-4294-8579-23fe05229f53" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 22 18:58:45.615542 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:58:45.615499 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6" podUID="2694409c-5883-4d28-a018-05aa1b9da10c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 22 18:58:53.424074 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:58:53.424020 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb" podUID="8db883c7-aba8-4294-8579-23fe05229f53" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 22 18:58:55.614941 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:58:55.614889 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6" podUID="2694409c-5883-4d28-a018-05aa1b9da10c" containerName="main" probeResult="failure" 
output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 22 18:59:03.423429 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:03.423372 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb" podUID="8db883c7-aba8-4294-8579-23fe05229f53" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 22 18:59:05.615792 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:05.615747 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6" podUID="2694409c-5883-4d28-a018-05aa1b9da10c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 22 18:59:13.423815 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:13.423768 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb" podUID="8db883c7-aba8-4294-8579-23fe05229f53" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 22 18:59:15.624920 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:15.624886 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6" Apr 22 18:59:15.632776 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:15.632730 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6" Apr 22 18:59:17.411250 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:17.411205 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6"] Apr 22 18:59:17.411755 ip-10-0-132-151 
kubenswrapper[2578]: I0422 18:59:17.411472 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6" podUID="2694409c-5883-4d28-a018-05aa1b9da10c" containerName="main" containerID="cri-o://9c9151104bcc831f126454bb822bf6cf5ba08c22b6c8505e4b2da18cfc43b9e1" gracePeriod=30 Apr 22 18:59:23.424022 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:23.423969 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb" podUID="8db883c7-aba8-4294-8579-23fe05229f53" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 22 18:59:33.433258 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:33.433217 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb" Apr 22 18:59:33.441302 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:33.441274 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb" Apr 22 18:59:39.808243 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:39.808207 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb"] Apr 22 18:59:39.808751 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:39.808524 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb" podUID="8db883c7-aba8-4294-8579-23fe05229f53" containerName="main" containerID="cri-o://fc437eb72b3b732b51802a353da8256e41c225e200beeb6ba5329427a9d154a4" gracePeriod=30 Apr 22 18:59:47.662188 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:47.662162 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-57f45f89c4-zbqw6_2694409c-5883-4d28-a018-05aa1b9da10c/main/0.log" Apr 22 18:59:47.662612 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:47.662591 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6" Apr 22 18:59:47.828229 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:47.828131 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwsvv\" (UniqueName: \"kubernetes.io/projected/2694409c-5883-4d28-a018-05aa1b9da10c-kube-api-access-kwsvv\") pod \"2694409c-5883-4d28-a018-05aa1b9da10c\" (UID: \"2694409c-5883-4d28-a018-05aa1b9da10c\") " Apr 22 18:59:47.828229 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:47.828211 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2694409c-5883-4d28-a018-05aa1b9da10c-dshm\") pod \"2694409c-5883-4d28-a018-05aa1b9da10c\" (UID: \"2694409c-5883-4d28-a018-05aa1b9da10c\") " Apr 22 18:59:47.828493 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:47.828236 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2694409c-5883-4d28-a018-05aa1b9da10c-tls-certs\") pod \"2694409c-5883-4d28-a018-05aa1b9da10c\" (UID: \"2694409c-5883-4d28-a018-05aa1b9da10c\") " Apr 22 18:59:47.828493 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:47.828277 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2694409c-5883-4d28-a018-05aa1b9da10c-home\") pod \"2694409c-5883-4d28-a018-05aa1b9da10c\" (UID: \"2694409c-5883-4d28-a018-05aa1b9da10c\") " Apr 22 18:59:47.828493 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:47.828296 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" 
(UniqueName: \"kubernetes.io/empty-dir/2694409c-5883-4d28-a018-05aa1b9da10c-model-cache\") pod \"2694409c-5883-4d28-a018-05aa1b9da10c\" (UID: \"2694409c-5883-4d28-a018-05aa1b9da10c\") " Apr 22 18:59:47.828493 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:47.828315 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2694409c-5883-4d28-a018-05aa1b9da10c-kserve-provision-location\") pod \"2694409c-5883-4d28-a018-05aa1b9da10c\" (UID: \"2694409c-5883-4d28-a018-05aa1b9da10c\") " Apr 22 18:59:47.828719 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:47.828617 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2694409c-5883-4d28-a018-05aa1b9da10c-model-cache" (OuterVolumeSpecName: "model-cache") pod "2694409c-5883-4d28-a018-05aa1b9da10c" (UID: "2694409c-5883-4d28-a018-05aa1b9da10c"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:59:47.828794 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:47.828765 2578 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2694409c-5883-4d28-a018-05aa1b9da10c-model-cache\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:59:47.828901 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:47.828873 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2694409c-5883-4d28-a018-05aa1b9da10c-home" (OuterVolumeSpecName: "home") pod "2694409c-5883-4d28-a018-05aa1b9da10c" (UID: "2694409c-5883-4d28-a018-05aa1b9da10c"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:59:47.830634 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:47.830599 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2694409c-5883-4d28-a018-05aa1b9da10c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "2694409c-5883-4d28-a018-05aa1b9da10c" (UID: "2694409c-5883-4d28-a018-05aa1b9da10c"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:59:47.830771 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:47.830634 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2694409c-5883-4d28-a018-05aa1b9da10c-dshm" (OuterVolumeSpecName: "dshm") pod "2694409c-5883-4d28-a018-05aa1b9da10c" (UID: "2694409c-5883-4d28-a018-05aa1b9da10c"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:59:47.830771 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:47.830650 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2694409c-5883-4d28-a018-05aa1b9da10c-kube-api-access-kwsvv" (OuterVolumeSpecName: "kube-api-access-kwsvv") pod "2694409c-5883-4d28-a018-05aa1b9da10c" (UID: "2694409c-5883-4d28-a018-05aa1b9da10c"). InnerVolumeSpecName "kube-api-access-kwsvv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:59:47.895197 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:47.895154 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2694409c-5883-4d28-a018-05aa1b9da10c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2694409c-5883-4d28-a018-05aa1b9da10c" (UID: "2694409c-5883-4d28-a018-05aa1b9da10c"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:59:47.930081 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:47.930042 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2694409c-5883-4d28-a018-05aa1b9da10c-tls-certs\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:59:47.930081 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:47.930080 2578 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2694409c-5883-4d28-a018-05aa1b9da10c-home\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:59:47.930274 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:47.930095 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2694409c-5883-4d28-a018-05aa1b9da10c-kserve-provision-location\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:59:47.930274 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:47.930110 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kwsvv\" (UniqueName: \"kubernetes.io/projected/2694409c-5883-4d28-a018-05aa1b9da10c-kube-api-access-kwsvv\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:59:47.930274 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:47.930125 2578 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2694409c-5883-4d28-a018-05aa1b9da10c-dshm\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 18:59:48.016436 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:48.016410 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-57f45f89c4-zbqw6_2694409c-5883-4d28-a018-05aa1b9da10c/main/0.log" Apr 22 18:59:48.016761 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:48.016734 2578 generic.go:358] "Generic (PLEG): container finished" 
podID="2694409c-5883-4d28-a018-05aa1b9da10c" containerID="9c9151104bcc831f126454bb822bf6cf5ba08c22b6c8505e4b2da18cfc43b9e1" exitCode=137 Apr 22 18:59:48.016846 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:48.016828 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6" Apr 22 18:59:48.016922 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:48.016820 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6" event={"ID":"2694409c-5883-4d28-a018-05aa1b9da10c","Type":"ContainerDied","Data":"9c9151104bcc831f126454bb822bf6cf5ba08c22b6c8505e4b2da18cfc43b9e1"} Apr 22 18:59:48.016982 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:48.016938 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6" event={"ID":"2694409c-5883-4d28-a018-05aa1b9da10c","Type":"ContainerDied","Data":"5d5d006b5cb834f933c217ccb068d54993a0a69a9a6fc458090267991eee3efa"} Apr 22 18:59:48.016982 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:48.016961 2578 scope.go:117] "RemoveContainer" containerID="9c9151104bcc831f126454bb822bf6cf5ba08c22b6c8505e4b2da18cfc43b9e1" Apr 22 18:59:48.036084 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:48.036063 2578 scope.go:117] "RemoveContainer" containerID="895f3810ee809fa2302f4f662b08d8cf705db9776d4addcc07b161c5124c7e7c" Apr 22 18:59:48.039565 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:48.039538 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6"] Apr 22 18:59:48.043511 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:48.043485 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-57f45f89c4-zbqw6"] Apr 22 18:59:48.047203 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:48.047183 2578 scope.go:117] "RemoveContainer" 
containerID="9c9151104bcc831f126454bb822bf6cf5ba08c22b6c8505e4b2da18cfc43b9e1" Apr 22 18:59:48.047537 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:59:48.047508 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c9151104bcc831f126454bb822bf6cf5ba08c22b6c8505e4b2da18cfc43b9e1\": container with ID starting with 9c9151104bcc831f126454bb822bf6cf5ba08c22b6c8505e4b2da18cfc43b9e1 not found: ID does not exist" containerID="9c9151104bcc831f126454bb822bf6cf5ba08c22b6c8505e4b2da18cfc43b9e1" Apr 22 18:59:48.047608 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:48.047551 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c9151104bcc831f126454bb822bf6cf5ba08c22b6c8505e4b2da18cfc43b9e1"} err="failed to get container status \"9c9151104bcc831f126454bb822bf6cf5ba08c22b6c8505e4b2da18cfc43b9e1\": rpc error: code = NotFound desc = could not find container \"9c9151104bcc831f126454bb822bf6cf5ba08c22b6c8505e4b2da18cfc43b9e1\": container with ID starting with 9c9151104bcc831f126454bb822bf6cf5ba08c22b6c8505e4b2da18cfc43b9e1 not found: ID does not exist" Apr 22 18:59:48.047608 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:48.047573 2578 scope.go:117] "RemoveContainer" containerID="895f3810ee809fa2302f4f662b08d8cf705db9776d4addcc07b161c5124c7e7c" Apr 22 18:59:48.047859 ip-10-0-132-151 kubenswrapper[2578]: E0422 18:59:48.047841 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"895f3810ee809fa2302f4f662b08d8cf705db9776d4addcc07b161c5124c7e7c\": container with ID starting with 895f3810ee809fa2302f4f662b08d8cf705db9776d4addcc07b161c5124c7e7c not found: ID does not exist" containerID="895f3810ee809fa2302f4f662b08d8cf705db9776d4addcc07b161c5124c7e7c" Apr 22 18:59:48.047915 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:48.047865 2578 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"895f3810ee809fa2302f4f662b08d8cf705db9776d4addcc07b161c5124c7e7c"} err="failed to get container status \"895f3810ee809fa2302f4f662b08d8cf705db9776d4addcc07b161c5124c7e7c\": rpc error: code = NotFound desc = could not find container \"895f3810ee809fa2302f4f662b08d8cf705db9776d4addcc07b161c5124c7e7c\": container with ID starting with 895f3810ee809fa2302f4f662b08d8cf705db9776d4addcc07b161c5124c7e7c not found: ID does not exist" Apr 22 18:59:49.539206 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:49.539171 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2694409c-5883-4d28-a018-05aa1b9da10c" path="/var/lib/kubelet/pods/2694409c-5883-4d28-a018-05aa1b9da10c/volumes" Apr 22 18:59:50.158767 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.158734 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs"] Apr 22 18:59:50.159146 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.159131 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e612217-0635-4803-82a4-f98382d32a17" containerName="storage-initializer" Apr 22 18:59:50.159205 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.159148 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e612217-0635-4803-82a4-f98382d32a17" containerName="storage-initializer" Apr 22 18:59:50.159205 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.159158 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2694409c-5883-4d28-a018-05aa1b9da10c" containerName="main" Apr 22 18:59:50.159205 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.159163 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2694409c-5883-4d28-a018-05aa1b9da10c" containerName="main" Apr 22 18:59:50.159205 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.159178 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="2694409c-5883-4d28-a018-05aa1b9da10c" containerName="storage-initializer" Apr 22 18:59:50.159205 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.159185 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2694409c-5883-4d28-a018-05aa1b9da10c" containerName="storage-initializer" Apr 22 18:59:50.159205 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.159192 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e612217-0635-4803-82a4-f98382d32a17" containerName="main" Apr 22 18:59:50.159205 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.159198 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e612217-0635-4803-82a4-f98382d32a17" containerName="main" Apr 22 18:59:50.159433 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.159263 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="7e612217-0635-4803-82a4-f98382d32a17" containerName="main" Apr 22 18:59:50.159433 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.159270 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="2694409c-5883-4d28-a018-05aa1b9da10c" containerName="main" Apr 22 18:59:50.164713 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.164683 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs"] Apr 22 18:59:50.164713 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.164713 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5"] Apr 22 18:59:50.164929 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.164894 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" Apr 22 18:59:50.167443 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.167421 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-dockercfg-64hx9\"" Apr 22 18:59:50.167580 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.167421 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 22 18:59:50.168552 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.168531 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" Apr 22 18:59:50.174277 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.174251 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5"] Apr 22 18:59:50.248046 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.248007 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs\" (UID: \"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" Apr 22 18:59:50.248229 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.248053 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs\" (UID: \"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" Apr 22 18:59:50.248229 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.248117 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6049d0b5-8297-4a15-97c4-f7a169955689-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5\" (UID: \"6049d0b5-8297-4a15-97c4-f7a169955689\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" Apr 22 18:59:50.248229 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.248160 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs\" (UID: \"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" Apr 22 18:59:50.248229 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.248181 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjbcf\" (UniqueName: \"kubernetes.io/projected/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10-kube-api-access-kjbcf\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs\" (UID: \"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" Apr 22 18:59:50.248229 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.248209 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6049d0b5-8297-4a15-97c4-f7a169955689-tls-certs\") pod 
\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5\" (UID: \"6049d0b5-8297-4a15-97c4-f7a169955689\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" Apr 22 18:59:50.248401 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.248256 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs\" (UID: \"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" Apr 22 18:59:50.248401 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.248283 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks6sf\" (UniqueName: \"kubernetes.io/projected/6049d0b5-8297-4a15-97c4-f7a169955689-kube-api-access-ks6sf\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5\" (UID: \"6049d0b5-8297-4a15-97c4-f7a169955689\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" Apr 22 18:59:50.248401 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.248309 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6049d0b5-8297-4a15-97c4-f7a169955689-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5\" (UID: \"6049d0b5-8297-4a15-97c4-f7a169955689\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" Apr 22 18:59:50.248401 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.248327 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10-dshm\") pod 
\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs\" (UID: \"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" Apr 22 18:59:50.248401 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.248360 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6049d0b5-8297-4a15-97c4-f7a169955689-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5\" (UID: \"6049d0b5-8297-4a15-97c4-f7a169955689\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" Apr 22 18:59:50.248401 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.248379 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6049d0b5-8297-4a15-97c4-f7a169955689-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5\" (UID: \"6049d0b5-8297-4a15-97c4-f7a169955689\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" Apr 22 18:59:50.349365 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.349328 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs\" (UID: \"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" Apr 22 18:59:50.349574 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.349380 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6049d0b5-8297-4a15-97c4-f7a169955689-kserve-provision-location\") pod 
\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5\" (UID: \"6049d0b5-8297-4a15-97c4-f7a169955689\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" Apr 22 18:59:50.349574 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.349406 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs\" (UID: \"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" Apr 22 18:59:50.349574 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.349423 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjbcf\" (UniqueName: \"kubernetes.io/projected/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10-kube-api-access-kjbcf\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs\" (UID: \"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" Apr 22 18:59:50.349574 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.349447 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6049d0b5-8297-4a15-97c4-f7a169955689-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5\" (UID: \"6049d0b5-8297-4a15-97c4-f7a169955689\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" Apr 22 18:59:50.349574 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.349495 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10-home\") pod 
\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs\" (UID: \"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" Apr 22 18:59:50.349574 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.349528 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ks6sf\" (UniqueName: \"kubernetes.io/projected/6049d0b5-8297-4a15-97c4-f7a169955689-kube-api-access-ks6sf\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5\" (UID: \"6049d0b5-8297-4a15-97c4-f7a169955689\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" Apr 22 18:59:50.349574 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.349560 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6049d0b5-8297-4a15-97c4-f7a169955689-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5\" (UID: \"6049d0b5-8297-4a15-97c4-f7a169955689\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" Apr 22 18:59:50.349949 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.349591 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs\" (UID: \"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" Apr 22 18:59:50.349949 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.349630 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6049d0b5-8297-4a15-97c4-f7a169955689-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5\" (UID: 
\"6049d0b5-8297-4a15-97c4-f7a169955689\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" Apr 22 18:59:50.349949 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.349657 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6049d0b5-8297-4a15-97c4-f7a169955689-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5\" (UID: \"6049d0b5-8297-4a15-97c4-f7a169955689\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" Apr 22 18:59:50.349949 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.349683 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs\" (UID: \"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" Apr 22 18:59:50.349949 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.349909 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6049d0b5-8297-4a15-97c4-f7a169955689-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5\" (UID: \"6049d0b5-8297-4a15-97c4-f7a169955689\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" Apr 22 18:59:50.349949 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.349925 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs\" (UID: 
\"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" Apr 22 18:59:50.350262 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.349960 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs\" (UID: \"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" Apr 22 18:59:50.350262 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.350129 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs\" (UID: \"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" Apr 22 18:59:50.350262 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.350188 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6049d0b5-8297-4a15-97c4-f7a169955689-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5\" (UID: \"6049d0b5-8297-4a15-97c4-f7a169955689\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" Apr 22 18:59:50.350430 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.350297 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6049d0b5-8297-4a15-97c4-f7a169955689-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5\" (UID: \"6049d0b5-8297-4a15-97c4-f7a169955689\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" 
Apr 22 18:59:50.351988 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.351966 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs\" (UID: \"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" Apr 22 18:59:50.352095 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.351986 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6049d0b5-8297-4a15-97c4-f7a169955689-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5\" (UID: \"6049d0b5-8297-4a15-97c4-f7a169955689\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" Apr 22 18:59:50.352095 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.352081 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs\" (UID: \"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" Apr 22 18:59:50.352212 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.352178 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6049d0b5-8297-4a15-97c4-f7a169955689-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5\" (UID: \"6049d0b5-8297-4a15-97c4-f7a169955689\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" Apr 22 18:59:50.358189 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.358168 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-kjbcf\" (UniqueName: \"kubernetes.io/projected/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10-kube-api-access-kjbcf\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs\" (UID: \"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" Apr 22 18:59:50.358301 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.358283 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks6sf\" (UniqueName: \"kubernetes.io/projected/6049d0b5-8297-4a15-97c4-f7a169955689-kube-api-access-ks6sf\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5\" (UID: \"6049d0b5-8297-4a15-97c4-f7a169955689\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" Apr 22 18:59:50.478676 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.478578 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" Apr 22 18:59:50.485425 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.485398 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" Apr 22 18:59:50.620798 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.620776 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs"] Apr 22 18:59:50.623359 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:59:50.623328 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8a5578b_be24_4a48_b8b7_8ec5e8f52b10.slice/crio-a775b87277014625a398e2483bbc0b8e13b3c35ed93a0cc788dc6372e9de530f WatchSource:0}: Error finding container a775b87277014625a398e2483bbc0b8e13b3c35ed93a0cc788dc6372e9de530f: Status 404 returned error can't find the container with id a775b87277014625a398e2483bbc0b8e13b3c35ed93a0cc788dc6372e9de530f Apr 22 18:59:50.625204 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.625188 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:59:50.638366 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:50.638323 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5"] Apr 22 18:59:50.640839 ip-10-0-132-151 kubenswrapper[2578]: W0422 18:59:50.640818 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6049d0b5_8297_4a15_97c4_f7a169955689.slice/crio-43b24a15e0a85736bb4cbca050dceae92c4214c31945d186ac6eef32f1af4bba WatchSource:0}: Error finding container 43b24a15e0a85736bb4cbca050dceae92c4214c31945d186ac6eef32f1af4bba: Status 404 returned error can't find the container with id 43b24a15e0a85736bb4cbca050dceae92c4214c31945d186ac6eef32f1af4bba Apr 22 18:59:51.030181 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:51.030124 2578 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" event={"ID":"6049d0b5-8297-4a15-97c4-f7a169955689","Type":"ContainerStarted","Data":"7b13a167989fe00642e3fc755c540220b0647644b62f4deb1d7b99520dfd02ef"} Apr 22 18:59:51.030181 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:51.030173 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" event={"ID":"6049d0b5-8297-4a15-97c4-f7a169955689","Type":"ContainerStarted","Data":"43b24a15e0a85736bb4cbca050dceae92c4214c31945d186ac6eef32f1af4bba"} Apr 22 18:59:51.031535 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:51.031492 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" event={"ID":"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10","Type":"ContainerStarted","Data":"a775b87277014625a398e2483bbc0b8e13b3c35ed93a0cc788dc6372e9de530f"} Apr 22 18:59:52.038222 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:52.038169 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" event={"ID":"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10","Type":"ContainerStarted","Data":"d5cc83e44d1e81989257d2d510c18df40f9628f938c3b01e453f795a95da1450"} Apr 22 18:59:52.038662 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:52.038410 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" Apr 22 18:59:53.045773 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:53.045733 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" 
event={"ID":"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10","Type":"ContainerStarted","Data":"f9362900cb9613660f3c558890e8dd675357c357a617c3af2b03dbfe93162298"} Apr 22 18:59:55.055859 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:55.055818 2578 generic.go:358] "Generic (PLEG): container finished" podID="6049d0b5-8297-4a15-97c4-f7a169955689" containerID="7b13a167989fe00642e3fc755c540220b0647644b62f4deb1d7b99520dfd02ef" exitCode=0 Apr 22 18:59:55.056284 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:55.055895 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" event={"ID":"6049d0b5-8297-4a15-97c4-f7a169955689","Type":"ContainerDied","Data":"7b13a167989fe00642e3fc755c540220b0647644b62f4deb1d7b99520dfd02ef"} Apr 22 18:59:55.155276 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:55.155242 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-dc64b4c84-fsldz"] Apr 22 18:59:55.155545 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:55.155515 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/llmisvc-controller-manager-dc64b4c84-fsldz" podUID="d4cbe54f-548a-471a-b49b-9a8f2f8a25c3" containerName="manager" containerID="cri-o://348fef1ea5665d78d6f4c9f100a797ed3391a6f4f7f585041cee605a66fd1f09" gracePeriod=30 Apr 22 18:59:56.062624 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:56.062585 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" event={"ID":"6049d0b5-8297-4a15-97c4-f7a169955689","Type":"ContainerStarted","Data":"65e1736e7ae2037b708a27b555c7968c686b713612721e21c956ecaac422a852"} Apr 22 18:59:56.088562 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:56.088504 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" podStartSLOduration=6.088487028 podStartE2EDuration="6.088487028s" podCreationTimestamp="2026-04-22 18:59:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:59:56.087646786 +0000 UTC m=+1325.077913450" watchObservedRunningTime="2026-04-22 18:59:56.088487028 +0000 UTC m=+1325.078753657" Apr 22 18:59:57.067738 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:57.067701 2578 generic.go:358] "Generic (PLEG): container finished" podID="f8a5578b-be24-4a48-b8b7-8ec5e8f52b10" containerID="f9362900cb9613660f3c558890e8dd675357c357a617c3af2b03dbfe93162298" exitCode=0 Apr 22 18:59:57.068155 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:57.067766 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" event={"ID":"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10","Type":"ContainerDied","Data":"f9362900cb9613660f3c558890e8dd675357c357a617c3af2b03dbfe93162298"} Apr 22 18:59:58.074226 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:58.074183 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" event={"ID":"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10","Type":"ContainerStarted","Data":"e69674a1bf8bf692d5246927a9fc34d8c3b54112626c6b62b6ff294b010639cf"} Apr 22 18:59:58.097584 ip-10-0-132-151 kubenswrapper[2578]: I0422 18:59:58.097494 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" podStartSLOduration=7.162124418 podStartE2EDuration="8.0974476s" podCreationTimestamp="2026-04-22 18:59:50 +0000 UTC" firstStartedPulling="2026-04-22 18:59:50.62532684 +0000 UTC m=+1319.615593451" lastFinishedPulling="2026-04-22 18:59:51.560650016 
+0000 UTC m=+1320.550916633" observedRunningTime="2026-04-22 18:59:58.095869226 +0000 UTC m=+1327.086135879" watchObservedRunningTime="2026-04-22 18:59:58.0974476 +0000 UTC m=+1327.087714210" Apr 22 19:00:00.479031 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:00.478976 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" Apr 22 19:00:00.479031 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:00.479030 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" Apr 22 19:00:00.480615 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:00.480582 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" podUID="f8a5578b-be24-4a48-b8b7-8ec5e8f52b10" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8001/health\": dial tcp 10.134.0.42:8001: connect: connection refused" Apr 22 19:00:00.485865 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:00.485839 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" Apr 22 19:00:00.486007 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:00.485884 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" Apr 22 19:00:00.487137 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:00.487101 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" podUID="6049d0b5-8297-4a15-97c4-f7a169955689" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: 
connect: connection refused" Apr 22 19:00:01.300586 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:01.300561 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-dc64b4c84-fsldz" Apr 22 19:00:01.354474 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:01.354424 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d4cbe54f-548a-471a-b49b-9a8f2f8a25c3-cert\") pod \"d4cbe54f-548a-471a-b49b-9a8f2f8a25c3\" (UID: \"d4cbe54f-548a-471a-b49b-9a8f2f8a25c3\") " Apr 22 19:00:01.354626 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:01.354556 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxq6t\" (UniqueName: \"kubernetes.io/projected/d4cbe54f-548a-471a-b49b-9a8f2f8a25c3-kube-api-access-sxq6t\") pod \"d4cbe54f-548a-471a-b49b-9a8f2f8a25c3\" (UID: \"d4cbe54f-548a-471a-b49b-9a8f2f8a25c3\") " Apr 22 19:00:01.356637 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:01.356589 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4cbe54f-548a-471a-b49b-9a8f2f8a25c3-cert" (OuterVolumeSpecName: "cert") pod "d4cbe54f-548a-471a-b49b-9a8f2f8a25c3" (UID: "d4cbe54f-548a-471a-b49b-9a8f2f8a25c3"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:00:01.356755 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:01.356647 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4cbe54f-548a-471a-b49b-9a8f2f8a25c3-kube-api-access-sxq6t" (OuterVolumeSpecName: "kube-api-access-sxq6t") pod "d4cbe54f-548a-471a-b49b-9a8f2f8a25c3" (UID: "d4cbe54f-548a-471a-b49b-9a8f2f8a25c3"). InnerVolumeSpecName "kube-api-access-sxq6t". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:00:01.456032 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:01.455989 2578 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d4cbe54f-548a-471a-b49b-9a8f2f8a25c3-cert\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:00:01.456032 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:01.456027 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sxq6t\" (UniqueName: \"kubernetes.io/projected/d4cbe54f-548a-471a-b49b-9a8f2f8a25c3-kube-api-access-sxq6t\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:00:02.088904 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:02.088860 2578 generic.go:358] "Generic (PLEG): container finished" podID="d4cbe54f-548a-471a-b49b-9a8f2f8a25c3" containerID="348fef1ea5665d78d6f4c9f100a797ed3391a6f4f7f585041cee605a66fd1f09" exitCode=0 Apr 22 19:00:02.089408 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:02.088927 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-dc64b4c84-fsldz" Apr 22 19:00:02.089408 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:02.088928 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-dc64b4c84-fsldz" event={"ID":"d4cbe54f-548a-471a-b49b-9a8f2f8a25c3","Type":"ContainerDied","Data":"348fef1ea5665d78d6f4c9f100a797ed3391a6f4f7f585041cee605a66fd1f09"} Apr 22 19:00:02.089408 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:02.089045 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-dc64b4c84-fsldz" event={"ID":"d4cbe54f-548a-471a-b49b-9a8f2f8a25c3","Type":"ContainerDied","Data":"7de378c261e2c73b9ece649262c34154e1c2294090aaff47bf5ed2ded2ee9c6e"} Apr 22 19:00:02.089408 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:02.089068 2578 scope.go:117] "RemoveContainer" containerID="348fef1ea5665d78d6f4c9f100a797ed3391a6f4f7f585041cee605a66fd1f09" Apr 22 19:00:02.099298 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:02.099275 2578 scope.go:117] "RemoveContainer" containerID="348fef1ea5665d78d6f4c9f100a797ed3391a6f4f7f585041cee605a66fd1f09" Apr 22 19:00:02.099675 ip-10-0-132-151 kubenswrapper[2578]: E0422 19:00:02.099642 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"348fef1ea5665d78d6f4c9f100a797ed3391a6f4f7f585041cee605a66fd1f09\": container with ID starting with 348fef1ea5665d78d6f4c9f100a797ed3391a6f4f7f585041cee605a66fd1f09 not found: ID does not exist" containerID="348fef1ea5665d78d6f4c9f100a797ed3391a6f4f7f585041cee605a66fd1f09" Apr 22 19:00:02.099786 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:02.099686 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"348fef1ea5665d78d6f4c9f100a797ed3391a6f4f7f585041cee605a66fd1f09"} err="failed to get container status 
\"348fef1ea5665d78d6f4c9f100a797ed3391a6f4f7f585041cee605a66fd1f09\": rpc error: code = NotFound desc = could not find container \"348fef1ea5665d78d6f4c9f100a797ed3391a6f4f7f585041cee605a66fd1f09\": container with ID starting with 348fef1ea5665d78d6f4c9f100a797ed3391a6f4f7f585041cee605a66fd1f09 not found: ID does not exist" Apr 22 19:00:02.106193 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:02.106162 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-dc64b4c84-fsldz"] Apr 22 19:00:02.110010 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:02.109979 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/llmisvc-controller-manager-dc64b4c84-fsldz"] Apr 22 19:00:03.540312 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:03.540273 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4cbe54f-548a-471a-b49b-9a8f2f8a25c3" path="/var/lib/kubelet/pods/d4cbe54f-548a-471a-b49b-9a8f2f8a25c3/volumes" Apr 22 19:00:10.102307 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:10.102277 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-75b8c8cc4b-6d9gb_8db883c7-aba8-4294-8579-23fe05229f53/main/0.log" Apr 22 19:00:10.102705 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:10.102689 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb" Apr 22 19:00:10.121003 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:10.120976 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-75b8c8cc4b-6d9gb_8db883c7-aba8-4294-8579-23fe05229f53/main/0.log" Apr 22 19:00:10.121354 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:10.121328 2578 generic.go:358] "Generic (PLEG): container finished" podID="8db883c7-aba8-4294-8579-23fe05229f53" containerID="fc437eb72b3b732b51802a353da8256e41c225e200beeb6ba5329427a9d154a4" exitCode=137 Apr 22 19:00:10.121484 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:10.121378 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb" event={"ID":"8db883c7-aba8-4294-8579-23fe05229f53","Type":"ContainerDied","Data":"fc437eb72b3b732b51802a353da8256e41c225e200beeb6ba5329427a9d154a4"} Apr 22 19:00:10.121484 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:10.121405 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb" Apr 22 19:00:10.121484 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:10.121421 2578 scope.go:117] "RemoveContainer" containerID="fc437eb72b3b732b51802a353da8256e41c225e200beeb6ba5329427a9d154a4" Apr 22 19:00:10.121655 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:10.121406 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb" event={"ID":"8db883c7-aba8-4294-8579-23fe05229f53","Type":"ContainerDied","Data":"83fb6cfbc9d1c9a6481c0ddaf469c0d45cfa8f44fb0e8f0d0743e990774fdba4"} Apr 22 19:00:10.151584 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:10.151557 2578 scope.go:117] "RemoveContainer" containerID="3facddb59bdfb8f391dedc9aa5737760e2a33490161f29c573f0d774e309c188" Apr 22 19:00:10.221545 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:10.221520 2578 scope.go:117] "RemoveContainer" containerID="fc437eb72b3b732b51802a353da8256e41c225e200beeb6ba5329427a9d154a4" Apr 22 19:00:10.221920 ip-10-0-132-151 kubenswrapper[2578]: E0422 19:00:10.221899 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc437eb72b3b732b51802a353da8256e41c225e200beeb6ba5329427a9d154a4\": container with ID starting with fc437eb72b3b732b51802a353da8256e41c225e200beeb6ba5329427a9d154a4 not found: ID does not exist" containerID="fc437eb72b3b732b51802a353da8256e41c225e200beeb6ba5329427a9d154a4" Apr 22 19:00:10.221995 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:10.221930 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc437eb72b3b732b51802a353da8256e41c225e200beeb6ba5329427a9d154a4"} err="failed to get container status \"fc437eb72b3b732b51802a353da8256e41c225e200beeb6ba5329427a9d154a4\": rpc error: code = NotFound desc = could not find container 
\"fc437eb72b3b732b51802a353da8256e41c225e200beeb6ba5329427a9d154a4\": container with ID starting with fc437eb72b3b732b51802a353da8256e41c225e200beeb6ba5329427a9d154a4 not found: ID does not exist" Apr 22 19:00:10.221995 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:10.221951 2578 scope.go:117] "RemoveContainer" containerID="3facddb59bdfb8f391dedc9aa5737760e2a33490161f29c573f0d774e309c188" Apr 22 19:00:10.222249 ip-10-0-132-151 kubenswrapper[2578]: E0422 19:00:10.222232 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3facddb59bdfb8f391dedc9aa5737760e2a33490161f29c573f0d774e309c188\": container with ID starting with 3facddb59bdfb8f391dedc9aa5737760e2a33490161f29c573f0d774e309c188 not found: ID does not exist" containerID="3facddb59bdfb8f391dedc9aa5737760e2a33490161f29c573f0d774e309c188" Apr 22 19:00:10.222291 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:10.222261 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3facddb59bdfb8f391dedc9aa5737760e2a33490161f29c573f0d774e309c188"} err="failed to get container status \"3facddb59bdfb8f391dedc9aa5737760e2a33490161f29c573f0d774e309c188\": rpc error: code = NotFound desc = could not find container \"3facddb59bdfb8f391dedc9aa5737760e2a33490161f29c573f0d774e309c188\": container with ID starting with 3facddb59bdfb8f391dedc9aa5737760e2a33490161f29c573f0d774e309c188 not found: ID does not exist" Apr 22 19:00:10.240726 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:10.240698 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8db883c7-aba8-4294-8579-23fe05229f53-home\") pod \"8db883c7-aba8-4294-8579-23fe05229f53\" (UID: \"8db883c7-aba8-4294-8579-23fe05229f53\") " Apr 22 19:00:10.240828 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:10.240747 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-pfj45\" (UniqueName: \"kubernetes.io/projected/8db883c7-aba8-4294-8579-23fe05229f53-kube-api-access-pfj45\") pod \"8db883c7-aba8-4294-8579-23fe05229f53\" (UID: \"8db883c7-aba8-4294-8579-23fe05229f53\") " Apr 22 19:00:10.240828 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:10.240785 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8db883c7-aba8-4294-8579-23fe05229f53-model-cache\") pod \"8db883c7-aba8-4294-8579-23fe05229f53\" (UID: \"8db883c7-aba8-4294-8579-23fe05229f53\") " Apr 22 19:00:10.240968 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:10.240851 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8db883c7-aba8-4294-8579-23fe05229f53-kserve-provision-location\") pod \"8db883c7-aba8-4294-8579-23fe05229f53\" (UID: \"8db883c7-aba8-4294-8579-23fe05229f53\") " Apr 22 19:00:10.240968 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:10.240899 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8db883c7-aba8-4294-8579-23fe05229f53-dshm\") pod \"8db883c7-aba8-4294-8579-23fe05229f53\" (UID: \"8db883c7-aba8-4294-8579-23fe05229f53\") " Apr 22 19:00:10.240968 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:10.240950 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8db883c7-aba8-4294-8579-23fe05229f53-tls-certs\") pod \"8db883c7-aba8-4294-8579-23fe05229f53\" (UID: \"8db883c7-aba8-4294-8579-23fe05229f53\") " Apr 22 19:00:10.241268 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:10.241127 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8db883c7-aba8-4294-8579-23fe05229f53-home" (OuterVolumeSpecName: "home") pod 
"8db883c7-aba8-4294-8579-23fe05229f53" (UID: "8db883c7-aba8-4294-8579-23fe05229f53"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:00:10.241268 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:10.241163 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8db883c7-aba8-4294-8579-23fe05229f53-model-cache" (OuterVolumeSpecName: "model-cache") pod "8db883c7-aba8-4294-8579-23fe05229f53" (UID: "8db883c7-aba8-4294-8579-23fe05229f53"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:00:10.241268 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:10.241187 2578 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8db883c7-aba8-4294-8579-23fe05229f53-home\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:00:10.243130 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:10.243105 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8db883c7-aba8-4294-8579-23fe05229f53-kube-api-access-pfj45" (OuterVolumeSpecName: "kube-api-access-pfj45") pod "8db883c7-aba8-4294-8579-23fe05229f53" (UID: "8db883c7-aba8-4294-8579-23fe05229f53"). InnerVolumeSpecName "kube-api-access-pfj45". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:00:10.243605 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:10.243585 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8db883c7-aba8-4294-8579-23fe05229f53-dshm" (OuterVolumeSpecName: "dshm") pod "8db883c7-aba8-4294-8579-23fe05229f53" (UID: "8db883c7-aba8-4294-8579-23fe05229f53"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:00:10.243686 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:10.243645 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8db883c7-aba8-4294-8579-23fe05229f53-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "8db883c7-aba8-4294-8579-23fe05229f53" (UID: "8db883c7-aba8-4294-8579-23fe05229f53"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:00:10.302343 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:10.302291 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8db883c7-aba8-4294-8579-23fe05229f53-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8db883c7-aba8-4294-8579-23fe05229f53" (UID: "8db883c7-aba8-4294-8579-23fe05229f53"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:00:10.342294 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:10.342188 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pfj45\" (UniqueName: \"kubernetes.io/projected/8db883c7-aba8-4294-8579-23fe05229f53-kube-api-access-pfj45\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:00:10.342294 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:10.342230 2578 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8db883c7-aba8-4294-8579-23fe05229f53-model-cache\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:00:10.342294 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:10.342246 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8db883c7-aba8-4294-8579-23fe05229f53-kserve-provision-location\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:00:10.342294 ip-10-0-132-151 
kubenswrapper[2578]: I0422 19:00:10.342265 2578 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8db883c7-aba8-4294-8579-23fe05229f53-dshm\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:00:10.342294 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:10.342279 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8db883c7-aba8-4294-8579-23fe05229f53-tls-certs\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:00:10.448916 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:10.448863 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb"] Apr 22 19:00:10.454899 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:10.454864 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-75b8c8cc4b-6d9gb"] Apr 22 19:00:10.479643 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:10.479592 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" podUID="f8a5578b-be24-4a48-b8b7-8ec5e8f52b10" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8001/health\": dial tcp 10.134.0.42:8001: connect: connection refused" Apr 22 19:00:10.486344 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:10.486296 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" podUID="6049d0b5-8297-4a15-97c4-f7a169955689" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused" Apr 22 19:00:10.492618 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:10.492594 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" Apr 22 19:00:11.539586 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:11.539546 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8db883c7-aba8-4294-8579-23fe05229f53" path="/var/lib/kubelet/pods/8db883c7-aba8-4294-8579-23fe05229f53/volumes" Apr 22 19:00:20.479413 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:20.479354 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" podUID="f8a5578b-be24-4a48-b8b7-8ec5e8f52b10" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8001/health\": dial tcp 10.134.0.42:8001: connect: connection refused" Apr 22 19:00:20.486012 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:20.485961 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" podUID="6049d0b5-8297-4a15-97c4-f7a169955689" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused" Apr 22 19:00:30.479535 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:30.479482 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" podUID="f8a5578b-be24-4a48-b8b7-8ec5e8f52b10" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8001/health\": dial tcp 10.134.0.42:8001: connect: connection refused" Apr 22 19:00:30.486298 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:30.486254 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" podUID="6049d0b5-8297-4a15-97c4-f7a169955689" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 
10.134.0.43:8000: connect: connection refused" Apr 22 19:00:40.479870 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:40.479811 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" podUID="f8a5578b-be24-4a48-b8b7-8ec5e8f52b10" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8001/health\": dial tcp 10.134.0.42:8001: connect: connection refused" Apr 22 19:00:40.486372 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:40.486332 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" podUID="6049d0b5-8297-4a15-97c4-f7a169955689" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused" Apr 22 19:00:50.479920 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:50.479862 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" podUID="f8a5578b-be24-4a48-b8b7-8ec5e8f52b10" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8001/health\": dial tcp 10.134.0.42:8001: connect: connection refused" Apr 22 19:00:50.486677 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:00:50.486615 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" podUID="6049d0b5-8297-4a15-97c4-f7a169955689" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused" Apr 22 19:01:00.229133 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:00.229092 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc"] Apr 22 19:01:00.229609 ip-10-0-132-151 
kubenswrapper[2578]: I0422 19:01:00.229451 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d4cbe54f-548a-471a-b49b-9a8f2f8a25c3" containerName="manager" Apr 22 19:01:00.229609 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:00.229478 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4cbe54f-548a-471a-b49b-9a8f2f8a25c3" containerName="manager" Apr 22 19:01:00.229609 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:00.229490 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8db883c7-aba8-4294-8579-23fe05229f53" containerName="main" Apr 22 19:01:00.229609 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:00.229496 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db883c7-aba8-4294-8579-23fe05229f53" containerName="main" Apr 22 19:01:00.229609 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:00.229514 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8db883c7-aba8-4294-8579-23fe05229f53" containerName="storage-initializer" Apr 22 19:01:00.229609 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:00.229519 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db883c7-aba8-4294-8579-23fe05229f53" containerName="storage-initializer" Apr 22 19:01:00.229609 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:00.229576 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="8db883c7-aba8-4294-8579-23fe05229f53" containerName="main" Apr 22 19:01:00.229609 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:00.229585 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d4cbe54f-548a-471a-b49b-9a8f2f8a25c3" containerName="manager" Apr 22 19:01:00.234069 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:00.234038 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" Apr 22 19:01:00.236579 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:00.236551 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8de1d74aab16d9cabd8b5aafeb5248e8-kserve-self-signed-certs\"" Apr 22 19:01:00.245336 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:00.245301 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc"] Apr 22 19:01:00.382986 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:00.382950 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw7b8\" (UniqueName: \"kubernetes.io/projected/c5c9c054-061b-421f-a220-c4f073642276-kube-api-access-dw7b8\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc\" (UID: \"c5c9c054-061b-421f-a220-c4f073642276\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" Apr 22 19:01:00.383164 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:00.383000 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c5c9c054-061b-421f-a220-c4f073642276-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc\" (UID: \"c5c9c054-061b-421f-a220-c4f073642276\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" Apr 22 19:01:00.383164 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:00.383035 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c5c9c054-061b-421f-a220-c4f073642276-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc\" (UID: 
\"c5c9c054-061b-421f-a220-c4f073642276\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" Apr 22 19:01:00.383164 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:00.383070 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5c9c054-061b-421f-a220-c4f073642276-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc\" (UID: \"c5c9c054-061b-421f-a220-c4f073642276\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" Apr 22 19:01:00.383164 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:00.383102 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c5c9c054-061b-421f-a220-c4f073642276-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc\" (UID: \"c5c9c054-061b-421f-a220-c4f073642276\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" Apr 22 19:01:00.383357 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:00.383167 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c5c9c054-061b-421f-a220-c4f073642276-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc\" (UID: \"c5c9c054-061b-421f-a220-c4f073642276\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" Apr 22 19:01:00.480112 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:00.479998 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" podUID="f8a5578b-be24-4a48-b8b7-8ec5e8f52b10" containerName="main" probeResult="failure" output="Get 
\"https://10.134.0.42:8001/health\": dial tcp 10.134.0.42:8001: connect: connection refused" Apr 22 19:01:00.484587 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:00.484545 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5c9c054-061b-421f-a220-c4f073642276-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc\" (UID: \"c5c9c054-061b-421f-a220-c4f073642276\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" Apr 22 19:01:00.484749 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:00.484602 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c5c9c054-061b-421f-a220-c4f073642276-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc\" (UID: \"c5c9c054-061b-421f-a220-c4f073642276\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" Apr 22 19:01:00.484749 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:00.484644 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c5c9c054-061b-421f-a220-c4f073642276-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc\" (UID: \"c5c9c054-061b-421f-a220-c4f073642276\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" Apr 22 19:01:00.484749 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:00.484693 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dw7b8\" (UniqueName: \"kubernetes.io/projected/c5c9c054-061b-421f-a220-c4f073642276-kube-api-access-dw7b8\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc\" (UID: \"c5c9c054-061b-421f-a220-c4f073642276\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" Apr 22 19:01:00.484749 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:00.484730 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c5c9c054-061b-421f-a220-c4f073642276-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc\" (UID: \"c5c9c054-061b-421f-a220-c4f073642276\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" Apr 22 19:01:00.484979 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:00.484757 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c5c9c054-061b-421f-a220-c4f073642276-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc\" (UID: \"c5c9c054-061b-421f-a220-c4f073642276\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" Apr 22 19:01:00.485095 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:00.485058 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5c9c054-061b-421f-a220-c4f073642276-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc\" (UID: \"c5c9c054-061b-421f-a220-c4f073642276\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" Apr 22 19:01:00.485169 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:00.485082 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c5c9c054-061b-421f-a220-c4f073642276-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc\" (UID: \"c5c9c054-061b-421f-a220-c4f073642276\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" Apr 22 19:01:00.485338 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:00.485311 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c5c9c054-061b-421f-a220-c4f073642276-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc\" (UID: \"c5c9c054-061b-421f-a220-c4f073642276\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" Apr 22 19:01:00.485827 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:00.485792 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" podUID="6049d0b5-8297-4a15-97c4-f7a169955689" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused" Apr 22 19:01:00.487232 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:00.487207 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c5c9c054-061b-421f-a220-c4f073642276-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc\" (UID: \"c5c9c054-061b-421f-a220-c4f073642276\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" Apr 22 19:01:00.487593 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:00.487572 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c5c9c054-061b-421f-a220-c4f073642276-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc\" (UID: \"c5c9c054-061b-421f-a220-c4f073642276\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" Apr 22 19:01:00.492667 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:00.492638 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw7b8\" (UniqueName: \"kubernetes.io/projected/c5c9c054-061b-421f-a220-c4f073642276-kube-api-access-dw7b8\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc\" (UID: \"c5c9c054-061b-421f-a220-c4f073642276\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" Apr 22 19:01:00.546838 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:00.546795 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" Apr 22 19:01:00.693864 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:00.693802 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc"] Apr 22 19:01:00.697578 ip-10-0-132-151 kubenswrapper[2578]: W0422 19:01:00.697534 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5c9c054_061b_421f_a220_c4f073642276.slice/crio-847d350c46324e0bc624975e3fbd5cbd98d13ed16307782a9a03656ac79a1e49 WatchSource:0}: Error finding container 847d350c46324e0bc624975e3fbd5cbd98d13ed16307782a9a03656ac79a1e49: Status 404 returned error can't find the container with id 847d350c46324e0bc624975e3fbd5cbd98d13ed16307782a9a03656ac79a1e49 Apr 22 19:01:01.313685 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:01.313646 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" event={"ID":"c5c9c054-061b-421f-a220-c4f073642276","Type":"ContainerStarted","Data":"e3e7d5da70be83f7946a75ea9700db87feee95bdfe99ff4f690d074ed2e50e38"} Apr 22 19:01:01.313685 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:01.313690 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" event={"ID":"c5c9c054-061b-421f-a220-c4f073642276","Type":"ContainerStarted","Data":"847d350c46324e0bc624975e3fbd5cbd98d13ed16307782a9a03656ac79a1e49"}
Apr 22 19:01:05.331030 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:05.330996 2578 generic.go:358] "Generic (PLEG): container finished" podID="c5c9c054-061b-421f-a220-c4f073642276" containerID="e3e7d5da70be83f7946a75ea9700db87feee95bdfe99ff4f690d074ed2e50e38" exitCode=0
Apr 22 19:01:05.331378 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:05.331073 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" event={"ID":"c5c9c054-061b-421f-a220-c4f073642276","Type":"ContainerDied","Data":"e3e7d5da70be83f7946a75ea9700db87feee95bdfe99ff4f690d074ed2e50e38"}
Apr 22 19:01:06.337108 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:06.337074 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" event={"ID":"c5c9c054-061b-421f-a220-c4f073642276","Type":"ContainerStarted","Data":"cd2ee370c60f1a9acbc475097a7b642da961a7580d86c2a92902dd33d63ca20c"}
Apr 22 19:01:06.361004 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:06.360951 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" podStartSLOduration=6.36093187 podStartE2EDuration="6.36093187s" podCreationTimestamp="2026-04-22 19:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:01:06.357594451 +0000 UTC m=+1395.347861079" watchObservedRunningTime="2026-04-22 19:01:06.36093187 +0000 UTC m=+1395.351198498"
Apr 22 19:01:10.479100 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:10.479048 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" podUID="f8a5578b-be24-4a48-b8b7-8ec5e8f52b10" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8001/health\": dial tcp 10.134.0.42:8001: connect: connection refused"
Apr 22 19:01:10.486185 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:10.486141 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" podUID="6049d0b5-8297-4a15-97c4-f7a169955689" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused"
Apr 22 19:01:10.547787 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:10.547749 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc"
Apr 22 19:01:10.547996 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:10.547920 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc"
Apr 22 19:01:10.549838 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:10.549799 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" podUID="c5c9c054-061b-421f-a220-c4f073642276" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused"
Apr 22 19:01:20.480048 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:20.480002 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" podUID="f8a5578b-be24-4a48-b8b7-8ec5e8f52b10" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8001/health\": dial tcp 10.134.0.42:8001: connect: connection refused"
Apr 22 19:01:20.486697 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:20.486660 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" podUID="6049d0b5-8297-4a15-97c4-f7a169955689" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused"
Apr 22 19:01:20.547987 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:20.547939 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" podUID="c5c9c054-061b-421f-a220-c4f073642276" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused"
Apr 22 19:01:30.480098 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:30.480047 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" podUID="f8a5578b-be24-4a48-b8b7-8ec5e8f52b10" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8001/health\": dial tcp 10.134.0.42:8001: connect: connection refused"
Apr 22 19:01:30.486570 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:30.486530 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" podUID="6049d0b5-8297-4a15-97c4-f7a169955689" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused"
Apr 22 19:01:30.547526 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:30.547485 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" podUID="c5c9c054-061b-421f-a220-c4f073642276" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused"
Apr 22 19:01:40.479717 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:40.479603 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" podUID="f8a5578b-be24-4a48-b8b7-8ec5e8f52b10" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8001/health\": dial tcp 10.134.0.42:8001: connect: connection refused"
Apr 22 19:01:40.486284 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:40.486244 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" podUID="6049d0b5-8297-4a15-97c4-f7a169955689" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused"
Apr 22 19:01:40.547340 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:40.547286 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" podUID="c5c9c054-061b-421f-a220-c4f073642276" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused"
Apr 22 19:01:50.479771 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:50.479713 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" podUID="f8a5578b-be24-4a48-b8b7-8ec5e8f52b10" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8001/health\": dial tcp 10.134.0.42:8001: connect: connection refused"
Apr 22 19:01:50.486384 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:50.486347 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" podUID="6049d0b5-8297-4a15-97c4-f7a169955689" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused"
Apr 22 19:01:50.547771 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:01:50.547714 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" podUID="c5c9c054-061b-421f-a220-c4f073642276" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused"
Apr 22 19:02:00.480073 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:00.480030 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" podUID="f8a5578b-be24-4a48-b8b7-8ec5e8f52b10" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8001/health\": dial tcp 10.134.0.42:8001: connect: connection refused"
Apr 22 19:02:00.486473 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:00.486424 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" podUID="6049d0b5-8297-4a15-97c4-f7a169955689" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused"
Apr 22 19:02:00.547483 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:00.547420 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" podUID="c5c9c054-061b-421f-a220-c4f073642276" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused"
Apr 22 19:02:10.479778 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:10.479729 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" podUID="f8a5578b-be24-4a48-b8b7-8ec5e8f52b10" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8001/health\": dial tcp 10.134.0.42:8001: connect: connection refused"
Apr 22 19:02:10.486162 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:10.486118 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" podUID="6049d0b5-8297-4a15-97c4-f7a169955689" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused"
Apr 22 19:02:10.547865 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:10.547818 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" podUID="c5c9c054-061b-421f-a220-c4f073642276" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused"
Apr 22 19:02:20.479743 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:20.479686 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" podUID="f8a5578b-be24-4a48-b8b7-8ec5e8f52b10" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8001/health\": dial tcp 10.134.0.42:8001: connect: connection refused"
Apr 22 19:02:20.496050 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:20.496012 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5"
Apr 22 19:02:20.504322 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:20.504288 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5"
Apr 22 19:02:20.547803 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:20.547760 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" podUID="c5c9c054-061b-421f-a220-c4f073642276" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused"
Apr 22 19:02:30.491179 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:30.491128 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs"
Apr 22 19:02:30.507061 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:30.507028 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs"
Apr 22 19:02:30.548130 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:30.548086 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" podUID="c5c9c054-061b-421f-a220-c4f073642276" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused"
Apr 22 19:02:40.548718 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:40.548666 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" podUID="c5c9c054-061b-421f-a220-c4f073642276" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused"
Apr 22 19:02:49.555907 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:49.555864 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs"]
Apr 22 19:02:49.556511 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:49.556368 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" podUID="f8a5578b-be24-4a48-b8b7-8ec5e8f52b10" containerName="main" containerID="cri-o://e69674a1bf8bf692d5246927a9fc34d8c3b54112626c6b62b6ff294b010639cf" gracePeriod=30
Apr 22 19:02:49.563813 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:49.563785 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5"]
Apr 22 19:02:49.564181 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:49.564122 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" podUID="6049d0b5-8297-4a15-97c4-f7a169955689" containerName="main" containerID="cri-o://65e1736e7ae2037b708a27b555c7968c686b713612721e21c956ecaac422a852" gracePeriod=30
Apr 22 19:02:50.548247 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:50.548205 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" podUID="c5c9c054-061b-421f-a220-c4f073642276" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused"
Apr 22 19:02:51.534919 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:51.534887 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2rx9_14a4227e-0dba-42ae-a250-c23eb012b836/ovn-acl-logging/0.log"
Apr 22 19:02:51.541707 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:51.541684 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2rx9_14a4227e-0dba-42ae-a250-c23eb012b836/ovn-acl-logging/0.log"
Apr 22 19:02:58.203170 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.203130 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9"]
Apr 22 19:02:58.209853 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.209828 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9"
Apr 22 19:02:58.212309 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.212268 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-dockercfg-4zr88\""
Apr 22 19:02:58.212504 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.212310 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\""
Apr 22 19:02:58.217226 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.217193 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9"]
Apr 22 19:02:58.222149 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.222116 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5"]
Apr 22 19:02:58.227275 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.227246 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5"
Apr 22 19:02:58.235907 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.235878 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5"]
Apr 22 19:02:58.253565 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.253525 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d327f30c-13b0-4f3d-b62f-22a8ea77ac77-dshm\") pod \"custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9\" (UID: \"d327f30c-13b0-4f3d-b62f-22a8ea77ac77\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9"
Apr 22 19:02:58.253742 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.253580 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d327f30c-13b0-4f3d-b62f-22a8ea77ac77-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9\" (UID: \"d327f30c-13b0-4f3d-b62f-22a8ea77ac77\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9"
Apr 22 19:02:58.253742 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.253693 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d327f30c-13b0-4f3d-b62f-22a8ea77ac77-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9\" (UID: \"d327f30c-13b0-4f3d-b62f-22a8ea77ac77\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9"
Apr 22 19:02:58.253832 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.253742 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5j8t\" (UniqueName: \"kubernetes.io/projected/d327f30c-13b0-4f3d-b62f-22a8ea77ac77-kube-api-access-m5j8t\") pod \"custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9\" (UID: \"d327f30c-13b0-4f3d-b62f-22a8ea77ac77\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9"
Apr 22 19:02:58.253832 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.253815 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d327f30c-13b0-4f3d-b62f-22a8ea77ac77-home\") pod \"custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9\" (UID: \"d327f30c-13b0-4f3d-b62f-22a8ea77ac77\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9"
Apr 22 19:02:58.253903 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.253860 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d327f30c-13b0-4f3d-b62f-22a8ea77ac77-model-cache\") pod \"custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9\" (UID: \"d327f30c-13b0-4f3d-b62f-22a8ea77ac77\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9"
Apr 22 19:02:58.354741 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.354694 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d327f30c-13b0-4f3d-b62f-22a8ea77ac77-home\") pod \"custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9\" (UID: \"d327f30c-13b0-4f3d-b62f-22a8ea77ac77\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9"
Apr 22 19:02:58.354741 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.354738 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d327f30c-13b0-4f3d-b62f-22a8ea77ac77-model-cache\") pod \"custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9\" (UID: \"d327f30c-13b0-4f3d-b62f-22a8ea77ac77\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9"
Apr 22 19:02:58.355025 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.354774 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtxc6\" (UniqueName: \"kubernetes.io/projected/9ac1c96c-e418-41db-817d-45b6972b1f45-kube-api-access-rtxc6\") pod \"custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5\" (UID: \"9ac1c96c-e418-41db-817d-45b6972b1f45\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5"
Apr 22 19:02:58.355025 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.354796 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d327f30c-13b0-4f3d-b62f-22a8ea77ac77-dshm\") pod \"custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9\" (UID: \"d327f30c-13b0-4f3d-b62f-22a8ea77ac77\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9"
Apr 22 19:02:58.355025 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.354817 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9ac1c96c-e418-41db-817d-45b6972b1f45-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5\" (UID: \"9ac1c96c-e418-41db-817d-45b6972b1f45\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5"
Apr 22 19:02:58.355025 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.354845 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9ac1c96c-e418-41db-817d-45b6972b1f45-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5\" (UID: \"9ac1c96c-e418-41db-817d-45b6972b1f45\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5"
Apr 22 19:02:58.355025 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.354873 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d327f30c-13b0-4f3d-b62f-22a8ea77ac77-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9\" (UID: \"d327f30c-13b0-4f3d-b62f-22a8ea77ac77\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9"
Apr 22 19:02:58.355268 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.355052 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ac1c96c-e418-41db-817d-45b6972b1f45-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5\" (UID: \"9ac1c96c-e418-41db-817d-45b6972b1f45\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5"
Apr 22 19:02:58.355268 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.355102 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d327f30c-13b0-4f3d-b62f-22a8ea77ac77-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9\" (UID: \"d327f30c-13b0-4f3d-b62f-22a8ea77ac77\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9"
Apr 22 19:02:58.355268 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.355149 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d327f30c-13b0-4f3d-b62f-22a8ea77ac77-home\") pod \"custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9\" (UID: \"d327f30c-13b0-4f3d-b62f-22a8ea77ac77\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9"
Apr 22 19:02:58.355268 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.355163 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9ac1c96c-e418-41db-817d-45b6972b1f45-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5\" (UID: \"9ac1c96c-e418-41db-817d-45b6972b1f45\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5"
Apr 22 19:02:58.355268 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.355176 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d327f30c-13b0-4f3d-b62f-22a8ea77ac77-model-cache\") pod \"custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9\" (UID: \"d327f30c-13b0-4f3d-b62f-22a8ea77ac77\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9"
Apr 22 19:02:58.355268 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.355219 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5j8t\" (UniqueName: \"kubernetes.io/projected/d327f30c-13b0-4f3d-b62f-22a8ea77ac77-kube-api-access-m5j8t\") pod \"custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9\" (UID: \"d327f30c-13b0-4f3d-b62f-22a8ea77ac77\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9"
Apr 22 19:02:58.355268 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.355254 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d327f30c-13b0-4f3d-b62f-22a8ea77ac77-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9\" (UID: \"d327f30c-13b0-4f3d-b62f-22a8ea77ac77\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9"
Apr 22 19:02:58.355268 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.355255 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9ac1c96c-e418-41db-817d-45b6972b1f45-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5\" (UID: \"9ac1c96c-e418-41db-817d-45b6972b1f45\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5"
Apr 22 19:02:58.357162 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.357139 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d327f30c-13b0-4f3d-b62f-22a8ea77ac77-dshm\") pod \"custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9\" (UID: \"d327f30c-13b0-4f3d-b62f-22a8ea77ac77\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9"
Apr 22 19:02:58.357421 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.357399 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d327f30c-13b0-4f3d-b62f-22a8ea77ac77-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9\" (UID: \"d327f30c-13b0-4f3d-b62f-22a8ea77ac77\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9"
Apr 22 19:02:58.363791 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.363762 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5j8t\" (UniqueName: \"kubernetes.io/projected/d327f30c-13b0-4f3d-b62f-22a8ea77ac77-kube-api-access-m5j8t\") pod \"custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9\" (UID: \"d327f30c-13b0-4f3d-b62f-22a8ea77ac77\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9"
Apr 22 19:02:58.456509 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.456405 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ac1c96c-e418-41db-817d-45b6972b1f45-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5\" (UID: \"9ac1c96c-e418-41db-817d-45b6972b1f45\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5"
Apr 22 19:02:58.456509 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.456475 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9ac1c96c-e418-41db-817d-45b6972b1f45-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5\" (UID: \"9ac1c96c-e418-41db-817d-45b6972b1f45\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5"
Apr 22 19:02:58.456509 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.456503 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9ac1c96c-e418-41db-817d-45b6972b1f45-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5\" (UID: \"9ac1c96c-e418-41db-817d-45b6972b1f45\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5"
Apr 22 19:02:58.456786 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.456537 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rtxc6\" (UniqueName: \"kubernetes.io/projected/9ac1c96c-e418-41db-817d-45b6972b1f45-kube-api-access-rtxc6\") pod \"custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5\" (UID: \"9ac1c96c-e418-41db-817d-45b6972b1f45\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5"
Apr 22 19:02:58.456786 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.456568 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9ac1c96c-e418-41db-817d-45b6972b1f45-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5\" (UID: \"9ac1c96c-e418-41db-817d-45b6972b1f45\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5"
Apr 22 19:02:58.456786 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.456596 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9ac1c96c-e418-41db-817d-45b6972b1f45-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5\" (UID: \"9ac1c96c-e418-41db-817d-45b6972b1f45\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5"
Apr 22 19:02:58.456937 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.456874 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ac1c96c-e418-41db-817d-45b6972b1f45-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5\" (UID: \"9ac1c96c-e418-41db-817d-45b6972b1f45\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5"
Apr 22 19:02:58.456937 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.456905 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9ac1c96c-e418-41db-817d-45b6972b1f45-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5\" (UID: \"9ac1c96c-e418-41db-817d-45b6972b1f45\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5"
Apr 22 19:02:58.457064 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.457040 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9ac1c96c-e418-41db-817d-45b6972b1f45-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5\" (UID: \"9ac1c96c-e418-41db-817d-45b6972b1f45\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5"
Apr 22 19:02:58.458987 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.458965 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9ac1c96c-e418-41db-817d-45b6972b1f45-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5\" (UID: \"9ac1c96c-e418-41db-817d-45b6972b1f45\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5"
Apr 22 19:02:58.459403 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.459378 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9ac1c96c-e418-41db-817d-45b6972b1f45-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5\" (UID: \"9ac1c96c-e418-41db-817d-45b6972b1f45\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5"
Apr 22 19:02:58.470256 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.470217 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtxc6\" (UniqueName: \"kubernetes.io/projected/9ac1c96c-e418-41db-817d-45b6972b1f45-kube-api-access-rtxc6\") pod \"custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5\" (UID: \"9ac1c96c-e418-41db-817d-45b6972b1f45\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5"
Apr 22 19:02:58.522723 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.522680 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9"
Apr 22 19:02:58.540659 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.540629 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5"
Apr 22 19:02:58.669286 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.669048 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9"]
Apr 22 19:02:58.672111 ip-10-0-132-151 kubenswrapper[2578]: W0422 19:02:58.672064 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd327f30c_13b0_4f3d_b62f_22a8ea77ac77.slice/crio-d6e99dc9cd5c0ddeccf191b657a0b71a586e7a5f9265f33bd9d3c00b240210c3 WatchSource:0}: Error finding container d6e99dc9cd5c0ddeccf191b657a0b71a586e7a5f9265f33bd9d3c00b240210c3: Status 404 returned error can't find the container with id d6e99dc9cd5c0ddeccf191b657a0b71a586e7a5f9265f33bd9d3c00b240210c3
Apr 22 19:02:58.696806 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.696770 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5"]
Apr 22 19:02:58.706044 ip-10-0-132-151 kubenswrapper[2578]: W0422 19:02:58.705999 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ac1c96c_e418_41db_817d_45b6972b1f45.slice/crio-2e547d077a24966e35929a64a825c381486667593c55825e945043c272fd7b6f WatchSource:0}: Error finding container 2e547d077a24966e35929a64a825c381486667593c55825e945043c272fd7b6f: Status 404 returned error can't find the container with id 2e547d077a24966e35929a64a825c381486667593c55825e945043c272fd7b6f
Apr 22 19:02:58.740451 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.740234 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5" event={"ID":"9ac1c96c-e418-41db-817d-45b6972b1f45","Type":"ContainerStarted","Data":"2e547d077a24966e35929a64a825c381486667593c55825e945043c272fd7b6f"}
Apr 22 19:02:58.741968 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.741938 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9" event={"ID":"d327f30c-13b0-4f3d-b62f-22a8ea77ac77","Type":"ContainerStarted","Data":"db8da86140ac4be673202acd654c379106ee464e0973892e3cc2fadbe4e01593"}
Apr 22 19:02:58.742114 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.741976 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9" event={"ID":"d327f30c-13b0-4f3d-b62f-22a8ea77ac77","Type":"ContainerStarted","Data":"d6e99dc9cd5c0ddeccf191b657a0b71a586e7a5f9265f33bd9d3c00b240210c3"}
Apr 22 19:02:58.742190 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:58.742150 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9"
Apr 22 19:02:59.747654 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:59.747612 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5" event={"ID":"9ac1c96c-e418-41db-817d-45b6972b1f45","Type":"ContainerStarted","Data":"914c4b11ea6e7da79d22f250b031931630235afe3a48219f9aad524e769e75e7"}
Apr 22 19:02:59.749334 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:02:59.749301 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9" event={"ID":"d327f30c-13b0-4f3d-b62f-22a8ea77ac77","Type":"ContainerStarted","Data":"7930c7dd752e9d5063cf9ea97870f4bb21864183a3f09dcf56d52ac9a5cc8a69"}
Apr 22 19:03:00.547617 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:00.547565 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" podUID="c5c9c054-061b-421f-a220-c4f073642276" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused"
Apr 22 19:03:03.771713 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:03.771680 2578 generic.go:358] "Generic (PLEG): container finished" podID="9ac1c96c-e418-41db-817d-45b6972b1f45" containerID="914c4b11ea6e7da79d22f250b031931630235afe3a48219f9aad524e769e75e7" exitCode=0
Apr 22 19:03:03.772287 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:03.771761 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5" event={"ID":"9ac1c96c-e418-41db-817d-45b6972b1f45","Type":"ContainerDied","Data":"914c4b11ea6e7da79d22f250b031931630235afe3a48219f9aad524e769e75e7"}
Apr 22 19:03:03.773516 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:03.773494 2578 generic.go:358] "Generic (PLEG): container finished" podID="d327f30c-13b0-4f3d-b62f-22a8ea77ac77" containerID="7930c7dd752e9d5063cf9ea97870f4bb21864183a3f09dcf56d52ac9a5cc8a69" exitCode=0
Apr 22 19:03:03.773615 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:03.773563 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9" event={"ID":"d327f30c-13b0-4f3d-b62f-22a8ea77ac77","Type":"ContainerDied","Data":"7930c7dd752e9d5063cf9ea97870f4bb21864183a3f09dcf56d52ac9a5cc8a69"}
Apr 22 19:03:04.779381 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:04.779337 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5" event={"ID":"9ac1c96c-e418-41db-817d-45b6972b1f45","Type":"ContainerStarted","Data":"7a65672ac2120a0d1ebfa2ec30ae76b54124b58bf43da156f5d2d96b42db63b2"}
Apr 22 19:03:04.781665
ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:04.781639 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9" event={"ID":"d327f30c-13b0-4f3d-b62f-22a8ea77ac77","Type":"ContainerStarted","Data":"b70d43ae8630e0b5af11e57235fa913f81523699f0c1998c80b0acd6b0c46b19"} Apr 22 19:03:04.802159 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:04.802103 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5" podStartSLOduration=6.80208387 podStartE2EDuration="6.80208387s" podCreationTimestamp="2026-04-22 19:02:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:03:04.800159306 +0000 UTC m=+1513.790425959" watchObservedRunningTime="2026-04-22 19:03:04.80208387 +0000 UTC m=+1513.792350498" Apr 22 19:03:04.822842 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:04.822713 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9" podStartSLOduration=6.822694714 podStartE2EDuration="6.822694714s" podCreationTimestamp="2026-04-22 19:02:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:03:04.819098941 +0000 UTC m=+1513.809365578" watchObservedRunningTime="2026-04-22 19:03:04.822694714 +0000 UTC m=+1513.812961341" Apr 22 19:03:08.523373 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:08.523330 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9" Apr 22 19:03:08.523373 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:08.523384 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9" Apr 22 19:03:08.524787 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:08.524747 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9" podUID="d327f30c-13b0-4f3d-b62f-22a8ea77ac77" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused" Apr 22 19:03:08.541539 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:08.541502 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5" Apr 22 19:03:08.541731 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:08.541681 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5" Apr 22 19:03:08.543011 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:08.542974 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5" podUID="9ac1c96c-e418-41db-817d-45b6972b1f45" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 22 19:03:08.543374 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:08.543354 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9" Apr 22 19:03:10.558491 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:10.558384 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" Apr 22 19:03:10.567576 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:10.567541 2578 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" Apr 22 19:03:18.523582 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:18.523536 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9" podUID="d327f30c-13b0-4f3d-b62f-22a8ea77ac77" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused" Apr 22 19:03:18.541606 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:18.541560 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5" podUID="9ac1c96c-e418-41db-817d-45b6972b1f45" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 22 19:03:19.557235 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:19.557159 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" podUID="f8a5578b-be24-4a48-b8b7-8ec5e8f52b10" containerName="llm-d-routing-sidecar" containerID="cri-o://d5cc83e44d1e81989257d2d510c18df40f9628f938c3b01e453f795a95da1450" gracePeriod=2 Apr 22 19:03:19.841567 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:19.841312 2578 generic.go:358] "Generic (PLEG): container finished" podID="6049d0b5-8297-4a15-97c4-f7a169955689" containerID="65e1736e7ae2037b708a27b555c7968c686b713612721e21c956ecaac422a852" exitCode=137 Apr 22 19:03:19.841567 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:19.841419 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" 
event={"ID":"6049d0b5-8297-4a15-97c4-f7a169955689","Type":"ContainerDied","Data":"65e1736e7ae2037b708a27b555c7968c686b713612721e21c956ecaac422a852"} Apr 22 19:03:19.844378 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:19.844345 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs_f8a5578b-be24-4a48-b8b7-8ec5e8f52b10/main/0.log" Apr 22 19:03:19.845212 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:19.845184 2578 generic.go:358] "Generic (PLEG): container finished" podID="f8a5578b-be24-4a48-b8b7-8ec5e8f52b10" containerID="e69674a1bf8bf692d5246927a9fc34d8c3b54112626c6b62b6ff294b010639cf" exitCode=137 Apr 22 19:03:19.845311 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:19.845213 2578 generic.go:358] "Generic (PLEG): container finished" podID="f8a5578b-be24-4a48-b8b7-8ec5e8f52b10" containerID="d5cc83e44d1e81989257d2d510c18df40f9628f938c3b01e453f795a95da1450" exitCode=0 Apr 22 19:03:19.845311 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:19.845258 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" event={"ID":"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10","Type":"ContainerDied","Data":"e69674a1bf8bf692d5246927a9fc34d8c3b54112626c6b62b6ff294b010639cf"} Apr 22 19:03:19.845311 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:19.845300 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" event={"ID":"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10","Type":"ContainerDied","Data":"d5cc83e44d1e81989257d2d510c18df40f9628f938c3b01e453f795a95da1450"} Apr 22 19:03:19.861903 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:19.861879 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs_f8a5578b-be24-4a48-b8b7-8ec5e8f52b10/main/0.log" Apr 22 19:03:19.862602 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:19.862582 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" Apr 22 19:03:19.896510 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:19.896434 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" Apr 22 19:03:19.973178 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:19.973145 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10-model-cache\") pod \"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10\" (UID: \"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10\") " Apr 22 19:03:19.973178 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:19.973195 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10-dshm\") pod \"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10\" (UID: \"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10\") " Apr 22 19:03:19.973480 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:19.973260 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10-home\") pod \"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10\" (UID: \"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10\") " Apr 22 19:03:19.973480 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:19.973307 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjbcf\" (UniqueName: 
\"kubernetes.io/projected/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10-kube-api-access-kjbcf\") pod \"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10\" (UID: \"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10\") " Apr 22 19:03:19.973480 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:19.973372 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10-tls-certs\") pod \"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10\" (UID: \"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10\") " Apr 22 19:03:19.973480 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:19.973398 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10-kserve-provision-location\") pod \"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10\" (UID: \"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10\") " Apr 22 19:03:19.973480 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:19.973435 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10-model-cache" (OuterVolumeSpecName: "model-cache") pod "f8a5578b-be24-4a48-b8b7-8ec5e8f52b10" (UID: "f8a5578b-be24-4a48-b8b7-8ec5e8f52b10"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:03:19.973742 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:19.973686 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10-home" (OuterVolumeSpecName: "home") pod "f8a5578b-be24-4a48-b8b7-8ec5e8f52b10" (UID: "f8a5578b-be24-4a48-b8b7-8ec5e8f52b10"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:03:19.973742 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:19.973712 2578 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10-model-cache\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:03:19.975767 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:19.975691 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f8a5578b-be24-4a48-b8b7-8ec5e8f52b10" (UID: "f8a5578b-be24-4a48-b8b7-8ec5e8f52b10"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:03:19.975948 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:19.975823 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10-kube-api-access-kjbcf" (OuterVolumeSpecName: "kube-api-access-kjbcf") pod "f8a5578b-be24-4a48-b8b7-8ec5e8f52b10" (UID: "f8a5578b-be24-4a48-b8b7-8ec5e8f52b10"). InnerVolumeSpecName "kube-api-access-kjbcf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:03:19.976095 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:19.976068 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10-dshm" (OuterVolumeSpecName: "dshm") pod "f8a5578b-be24-4a48-b8b7-8ec5e8f52b10" (UID: "f8a5578b-be24-4a48-b8b7-8ec5e8f52b10"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:03:20.030764 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.030705 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f8a5578b-be24-4a48-b8b7-8ec5e8f52b10" (UID: "f8a5578b-be24-4a48-b8b7-8ec5e8f52b10"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:03:20.074702 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.074652 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6049d0b5-8297-4a15-97c4-f7a169955689-kserve-provision-location\") pod \"6049d0b5-8297-4a15-97c4-f7a169955689\" (UID: \"6049d0b5-8297-4a15-97c4-f7a169955689\") " Apr 22 19:03:20.074895 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.074737 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6049d0b5-8297-4a15-97c4-f7a169955689-tls-certs\") pod \"6049d0b5-8297-4a15-97c4-f7a169955689\" (UID: \"6049d0b5-8297-4a15-97c4-f7a169955689\") " Apr 22 19:03:20.074895 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.074790 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks6sf\" (UniqueName: \"kubernetes.io/projected/6049d0b5-8297-4a15-97c4-f7a169955689-kube-api-access-ks6sf\") pod \"6049d0b5-8297-4a15-97c4-f7a169955689\" (UID: \"6049d0b5-8297-4a15-97c4-f7a169955689\") " Apr 22 19:03:20.074895 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.074856 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6049d0b5-8297-4a15-97c4-f7a169955689-dshm\") pod \"6049d0b5-8297-4a15-97c4-f7a169955689\" 
(UID: \"6049d0b5-8297-4a15-97c4-f7a169955689\") " Apr 22 19:03:20.075071 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.074905 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6049d0b5-8297-4a15-97c4-f7a169955689-model-cache\") pod \"6049d0b5-8297-4a15-97c4-f7a169955689\" (UID: \"6049d0b5-8297-4a15-97c4-f7a169955689\") " Apr 22 19:03:20.075071 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.074934 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6049d0b5-8297-4a15-97c4-f7a169955689-home\") pod \"6049d0b5-8297-4a15-97c4-f7a169955689\" (UID: \"6049d0b5-8297-4a15-97c4-f7a169955689\") " Apr 22 19:03:20.075224 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.075120 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6049d0b5-8297-4a15-97c4-f7a169955689-model-cache" (OuterVolumeSpecName: "model-cache") pod "6049d0b5-8297-4a15-97c4-f7a169955689" (UID: "6049d0b5-8297-4a15-97c4-f7a169955689"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:03:20.075286 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.075223 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kjbcf\" (UniqueName: \"kubernetes.io/projected/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10-kube-api-access-kjbcf\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:03:20.075286 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.075243 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10-tls-certs\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:03:20.075286 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.075259 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10-kserve-provision-location\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:03:20.075286 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.075274 2578 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10-dshm\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:03:20.075512 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.075289 2578 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10-home\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:03:20.075571 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.075501 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6049d0b5-8297-4a15-97c4-f7a169955689-home" (OuterVolumeSpecName: "home") pod "6049d0b5-8297-4a15-97c4-f7a169955689" (UID: "6049d0b5-8297-4a15-97c4-f7a169955689"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:03:20.077522 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.077488 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6049d0b5-8297-4a15-97c4-f7a169955689-kube-api-access-ks6sf" (OuterVolumeSpecName: "kube-api-access-ks6sf") pod "6049d0b5-8297-4a15-97c4-f7a169955689" (UID: "6049d0b5-8297-4a15-97c4-f7a169955689"). InnerVolumeSpecName "kube-api-access-ks6sf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:03:20.077685 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.077520 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6049d0b5-8297-4a15-97c4-f7a169955689-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6049d0b5-8297-4a15-97c4-f7a169955689" (UID: "6049d0b5-8297-4a15-97c4-f7a169955689"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:03:20.077760 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.077720 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6049d0b5-8297-4a15-97c4-f7a169955689-dshm" (OuterVolumeSpecName: "dshm") pod "6049d0b5-8297-4a15-97c4-f7a169955689" (UID: "6049d0b5-8297-4a15-97c4-f7a169955689"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:03:20.139159 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.139112 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6049d0b5-8297-4a15-97c4-f7a169955689-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6049d0b5-8297-4a15-97c4-f7a169955689" (UID: "6049d0b5-8297-4a15-97c4-f7a169955689"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:03:20.176511 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.176447 2578 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6049d0b5-8297-4a15-97c4-f7a169955689-model-cache\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:03:20.176511 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.176518 2578 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6049d0b5-8297-4a15-97c4-f7a169955689-home\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:03:20.176690 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.176533 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6049d0b5-8297-4a15-97c4-f7a169955689-kserve-provision-location\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:03:20.176690 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.176547 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6049d0b5-8297-4a15-97c4-f7a169955689-tls-certs\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:03:20.176690 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.176563 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ks6sf\" (UniqueName: \"kubernetes.io/projected/6049d0b5-8297-4a15-97c4-f7a169955689-kube-api-access-ks6sf\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:03:20.176690 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.176575 2578 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6049d0b5-8297-4a15-97c4-f7a169955689-dshm\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:03:20.850721 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.850625 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" event={"ID":"6049d0b5-8297-4a15-97c4-f7a169955689","Type":"ContainerDied","Data":"43b24a15e0a85736bb4cbca050dceae92c4214c31945d186ac6eef32f1af4bba"} Apr 22 19:03:20.850721 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.850685 2578 scope.go:117] "RemoveContainer" containerID="65e1736e7ae2037b708a27b555c7968c686b713612721e21c956ecaac422a852" Apr 22 19:03:20.851272 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.850639 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5" Apr 22 19:03:20.852189 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.852168 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs_f8a5578b-be24-4a48-b8b7-8ec5e8f52b10/main/0.log" Apr 22 19:03:20.853061 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.853036 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" event={"ID":"f8a5578b-be24-4a48-b8b7-8ec5e8f52b10","Type":"ContainerDied","Data":"a775b87277014625a398e2483bbc0b8e13b3c35ed93a0cc788dc6372e9de530f"} Apr 22 19:03:20.853188 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.853086 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs" Apr 22 19:03:20.873492 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.873445 2578 scope.go:117] "RemoveContainer" containerID="7b13a167989fe00642e3fc755c540220b0647644b62f4deb1d7b99520dfd02ef" Apr 22 19:03:20.879150 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.879111 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5"] Apr 22 19:03:20.882782 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.882752 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bkwnw5"] Apr 22 19:03:20.899058 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.899025 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs"] Apr 22 19:03:20.904884 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.904853 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5db6ddbb95fwnhs"] Apr 22 19:03:20.961896 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.961863 2578 scope.go:117] "RemoveContainer" containerID="e69674a1bf8bf692d5246927a9fc34d8c3b54112626c6b62b6ff294b010639cf" Apr 22 19:03:20.991547 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:20.991515 2578 scope.go:117] "RemoveContainer" containerID="f9362900cb9613660f3c558890e8dd675357c357a617c3af2b03dbfe93162298" Apr 22 19:03:21.056530 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:21.056505 2578 scope.go:117] "RemoveContainer" containerID="d5cc83e44d1e81989257d2d510c18df40f9628f938c3b01e453f795a95da1450" Apr 22 19:03:21.540167 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:21.540128 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6049d0b5-8297-4a15-97c4-f7a169955689" path="/var/lib/kubelet/pods/6049d0b5-8297-4a15-97c4-f7a169955689/volumes" Apr 22 19:03:21.540810 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:21.540790 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8a5578b-be24-4a48-b8b7-8ec5e8f52b10" path="/var/lib/kubelet/pods/f8a5578b-be24-4a48-b8b7-8ec5e8f52b10/volumes" Apr 22 19:03:28.524097 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:28.524042 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9" podUID="d327f30c-13b0-4f3d-b62f-22a8ea77ac77" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused" Apr 22 19:03:28.542515 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:28.542447 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5" podUID="9ac1c96c-e418-41db-817d-45b6972b1f45" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 22 19:03:38.524080 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:38.524034 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9" podUID="d327f30c-13b0-4f3d-b62f-22a8ea77ac77" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused" Apr 22 19:03:38.541757 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:38.541702 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5" podUID="9ac1c96c-e418-41db-817d-45b6972b1f45" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 
10.134.0.46:8000: connect: connection refused" Apr 22 19:03:42.604922 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:42.604878 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc"] Apr 22 19:03:42.605338 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:42.605275 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" podUID="c5c9c054-061b-421f-a220-c4f073642276" containerName="main" containerID="cri-o://cd2ee370c60f1a9acbc475097a7b642da961a7580d86c2a92902dd33d63ca20c" gracePeriod=30 Apr 22 19:03:48.523691 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:48.523637 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9" podUID="d327f30c-13b0-4f3d-b62f-22a8ea77ac77" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused" Apr 22 19:03:48.541342 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:48.541284 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5" podUID="9ac1c96c-e418-41db-817d-45b6972b1f45" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 22 19:03:48.844093 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:48.844003 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 22 19:03:48.844389 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:48.844373 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6049d0b5-8297-4a15-97c4-f7a169955689" containerName="main" Apr 22 19:03:48.844389 ip-10-0-132-151 
kubenswrapper[2578]: I0422 19:03:48.844388 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6049d0b5-8297-4a15-97c4-f7a169955689" containerName="main" Apr 22 19:03:48.844597 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:48.844403 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6049d0b5-8297-4a15-97c4-f7a169955689" containerName="storage-initializer" Apr 22 19:03:48.844597 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:48.844413 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6049d0b5-8297-4a15-97c4-f7a169955689" containerName="storage-initializer" Apr 22 19:03:48.844597 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:48.844438 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8a5578b-be24-4a48-b8b7-8ec5e8f52b10" containerName="llm-d-routing-sidecar" Apr 22 19:03:48.844597 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:48.844447 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8a5578b-be24-4a48-b8b7-8ec5e8f52b10" containerName="llm-d-routing-sidecar" Apr 22 19:03:48.844597 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:48.844471 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8a5578b-be24-4a48-b8b7-8ec5e8f52b10" containerName="storage-initializer" Apr 22 19:03:48.844597 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:48.844476 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8a5578b-be24-4a48-b8b7-8ec5e8f52b10" containerName="storage-initializer" Apr 22 19:03:48.844597 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:48.844485 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8a5578b-be24-4a48-b8b7-8ec5e8f52b10" containerName="main" Apr 22 19:03:48.844597 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:48.844490 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8a5578b-be24-4a48-b8b7-8ec5e8f52b10" containerName="main" Apr 22 19:03:48.844597 
ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:48.844551 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="6049d0b5-8297-4a15-97c4-f7a169955689" containerName="main" Apr 22 19:03:48.844597 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:48.844560 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8a5578b-be24-4a48-b8b7-8ec5e8f52b10" containerName="main" Apr 22 19:03:48.844597 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:48.844570 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8a5578b-be24-4a48-b8b7-8ec5e8f52b10" containerName="llm-d-routing-sidecar" Apr 22 19:03:48.846575 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:48.846558 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:03:48.848982 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:48.848955 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-5k2nv\"" Apr 22 19:03:48.849993 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:48.849966 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 22 19:03:48.857597 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:48.857572 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 22 19:03:48.951500 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:48.951446 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f3c2c47f-0d54-44cb-b674-b366b08fd393-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f3c2c47f-0d54-44cb-b674-b366b08fd393\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:03:48.951712 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:48.951512 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc8b7\" (UniqueName: \"kubernetes.io/projected/f3c2c47f-0d54-44cb-b674-b366b08fd393-kube-api-access-gc8b7\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f3c2c47f-0d54-44cb-b674-b366b08fd393\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:03:48.951712 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:48.951584 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f3c2c47f-0d54-44cb-b674-b366b08fd393-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f3c2c47f-0d54-44cb-b674-b366b08fd393\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:03:48.951712 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:48.951625 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f3c2c47f-0d54-44cb-b674-b366b08fd393-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f3c2c47f-0d54-44cb-b674-b366b08fd393\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:03:48.951712 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:48.951650 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f3c2c47f-0d54-44cb-b674-b366b08fd393-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f3c2c47f-0d54-44cb-b674-b366b08fd393\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:03:48.951712 
ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:48.951685 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f3c2c47f-0d54-44cb-b674-b366b08fd393-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f3c2c47f-0d54-44cb-b674-b366b08fd393\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:03:49.052542 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:49.052492 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gc8b7\" (UniqueName: \"kubernetes.io/projected/f3c2c47f-0d54-44cb-b674-b366b08fd393-kube-api-access-gc8b7\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f3c2c47f-0d54-44cb-b674-b366b08fd393\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:03:49.052724 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:49.052590 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f3c2c47f-0d54-44cb-b674-b366b08fd393-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f3c2c47f-0d54-44cb-b674-b366b08fd393\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:03:49.052724 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:49.052630 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f3c2c47f-0d54-44cb-b674-b366b08fd393-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f3c2c47f-0d54-44cb-b674-b366b08fd393\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:03:49.052724 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:49.052661 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f3c2c47f-0d54-44cb-b674-b366b08fd393-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f3c2c47f-0d54-44cb-b674-b366b08fd393\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:03:49.052724 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:49.052699 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f3c2c47f-0d54-44cb-b674-b366b08fd393-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f3c2c47f-0d54-44cb-b674-b366b08fd393\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:03:49.052970 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:49.052762 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f3c2c47f-0d54-44cb-b674-b366b08fd393-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f3c2c47f-0d54-44cb-b674-b366b08fd393\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:03:49.053057 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:49.053027 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f3c2c47f-0d54-44cb-b674-b366b08fd393-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f3c2c47f-0d54-44cb-b674-b366b08fd393\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:03:49.053200 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:49.053162 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/f3c2c47f-0d54-44cb-b674-b366b08fd393-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f3c2c47f-0d54-44cb-b674-b366b08fd393\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:03:49.053200 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:49.053185 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f3c2c47f-0d54-44cb-b674-b366b08fd393-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f3c2c47f-0d54-44cb-b674-b366b08fd393\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:03:49.055552 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:49.055522 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f3c2c47f-0d54-44cb-b674-b366b08fd393-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f3c2c47f-0d54-44cb-b674-b366b08fd393\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:03:49.055718 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:49.055703 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f3c2c47f-0d54-44cb-b674-b366b08fd393-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f3c2c47f-0d54-44cb-b674-b366b08fd393\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:03:49.060787 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:49.060759 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc8b7\" (UniqueName: \"kubernetes.io/projected/f3c2c47f-0d54-44cb-b674-b366b08fd393-kube-api-access-gc8b7\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: 
\"f3c2c47f-0d54-44cb-b674-b366b08fd393\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:03:49.158787 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:49.158743 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:03:49.309501 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:49.301971 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 22 19:03:49.970553 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:49.970511 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"f3c2c47f-0d54-44cb-b674-b366b08fd393","Type":"ContainerStarted","Data":"7e505b3529882f39909d2a69f2078f159e5016def76b0a5ba96b403b814848e2"} Apr 22 19:03:49.970553 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:49.970551 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"f3c2c47f-0d54-44cb-b674-b366b08fd393","Type":"ContainerStarted","Data":"cb166c0b36e9ecfbbc29d9d53125f419b0f48be1141e9c1b002fe6a27f717e30"} Apr 22 19:03:54.995041 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:54.994999 2578 generic.go:358] "Generic (PLEG): container finished" podID="f3c2c47f-0d54-44cb-b674-b366b08fd393" containerID="7e505b3529882f39909d2a69f2078f159e5016def76b0a5ba96b403b814848e2" exitCode=0 Apr 22 19:03:54.995440 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:54.995072 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"f3c2c47f-0d54-44cb-b674-b366b08fd393","Type":"ContainerDied","Data":"7e505b3529882f39909d2a69f2078f159e5016def76b0a5ba96b403b814848e2"} Apr 22 19:03:56.000753 ip-10-0-132-151 
kubenswrapper[2578]: I0422 19:03:56.000707 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"f3c2c47f-0d54-44cb-b674-b366b08fd393","Type":"ContainerStarted","Data":"5c5f45abc9867831b1ab1b5722098133efa2f30290993f5389d96711ce26fbc8"} Apr 22 19:03:56.020919 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:56.020854 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podStartSLOduration=8.020832769 podStartE2EDuration="8.020832769s" podCreationTimestamp="2026-04-22 19:03:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:03:56.020742208 +0000 UTC m=+1565.011008837" watchObservedRunningTime="2026-04-22 19:03:56.020832769 +0000 UTC m=+1565.011099399" Apr 22 19:03:58.523519 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:58.523453 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9" podUID="d327f30c-13b0-4f3d-b62f-22a8ea77ac77" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused" Apr 22 19:03:58.541391 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:58.541343 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5" podUID="9ac1c96c-e418-41db-817d-45b6972b1f45" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 22 19:03:59.158989 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:59.158942 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:03:59.160668 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:03:59.160626 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f3c2c47f-0d54-44cb-b674-b366b08fd393" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused" Apr 22 19:04:08.524124 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:08.524037 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9" podUID="d327f30c-13b0-4f3d-b62f-22a8ea77ac77" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused" Apr 22 19:04:08.541078 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:08.541028 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5" podUID="9ac1c96c-e418-41db-817d-45b6972b1f45" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 22 19:04:09.159887 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:09.159838 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f3c2c47f-0d54-44cb-b674-b366b08fd393" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused" Apr 22 19:04:12.936494 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:12.936454 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc_c5c9c054-061b-421f-a220-c4f073642276/main/0.log" 
Apr 22 19:04:12.936955 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:12.936933 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" Apr 22 19:04:12.962379 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:12.962349 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw7b8\" (UniqueName: \"kubernetes.io/projected/c5c9c054-061b-421f-a220-c4f073642276-kube-api-access-dw7b8\") pod \"c5c9c054-061b-421f-a220-c4f073642276\" (UID: \"c5c9c054-061b-421f-a220-c4f073642276\") " Apr 22 19:04:12.962583 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:12.962399 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c5c9c054-061b-421f-a220-c4f073642276-tls-certs\") pod \"c5c9c054-061b-421f-a220-c4f073642276\" (UID: \"c5c9c054-061b-421f-a220-c4f073642276\") " Apr 22 19:04:12.962583 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:12.962435 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c5c9c054-061b-421f-a220-c4f073642276-home\") pod \"c5c9c054-061b-421f-a220-c4f073642276\" (UID: \"c5c9c054-061b-421f-a220-c4f073642276\") " Apr 22 19:04:12.962710 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:12.962614 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c5c9c054-061b-421f-a220-c4f073642276-dshm\") pod \"c5c9c054-061b-421f-a220-c4f073642276\" (UID: \"c5c9c054-061b-421f-a220-c4f073642276\") " Apr 22 19:04:12.962710 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:12.962689 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/c5c9c054-061b-421f-a220-c4f073642276-kserve-provision-location\") pod \"c5c9c054-061b-421f-a220-c4f073642276\" (UID: \"c5c9c054-061b-421f-a220-c4f073642276\") " Apr 22 19:04:12.962825 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:12.962738 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c5c9c054-061b-421f-a220-c4f073642276-model-cache\") pod \"c5c9c054-061b-421f-a220-c4f073642276\" (UID: \"c5c9c054-061b-421f-a220-c4f073642276\") " Apr 22 19:04:12.963499 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:12.963336 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5c9c054-061b-421f-a220-c4f073642276-home" (OuterVolumeSpecName: "home") pod "c5c9c054-061b-421f-a220-c4f073642276" (UID: "c5c9c054-061b-421f-a220-c4f073642276"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:04:12.963874 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:12.963835 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5c9c054-061b-421f-a220-c4f073642276-model-cache" (OuterVolumeSpecName: "model-cache") pod "c5c9c054-061b-421f-a220-c4f073642276" (UID: "c5c9c054-061b-421f-a220-c4f073642276"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:04:12.966195 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:12.966158 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5c9c054-061b-421f-a220-c4f073642276-dshm" (OuterVolumeSpecName: "dshm") pod "c5c9c054-061b-421f-a220-c4f073642276" (UID: "c5c9c054-061b-421f-a220-c4f073642276"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:04:12.966195 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:12.966167 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5c9c054-061b-421f-a220-c4f073642276-kube-api-access-dw7b8" (OuterVolumeSpecName: "kube-api-access-dw7b8") pod "c5c9c054-061b-421f-a220-c4f073642276" (UID: "c5c9c054-061b-421f-a220-c4f073642276"). InnerVolumeSpecName "kube-api-access-dw7b8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:04:12.966511 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:12.966481 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5c9c054-061b-421f-a220-c4f073642276-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c5c9c054-061b-421f-a220-c4f073642276" (UID: "c5c9c054-061b-421f-a220-c4f073642276"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:04:13.050505 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:13.050437 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5c9c054-061b-421f-a220-c4f073642276-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c5c9c054-061b-421f-a220-c4f073642276" (UID: "c5c9c054-061b-421f-a220-c4f073642276"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:04:13.063578 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:13.063542 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c5c9c054-061b-421f-a220-c4f073642276-tls-certs\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:04:13.063578 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:13.063576 2578 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c5c9c054-061b-421f-a220-c4f073642276-home\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:04:13.063845 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:13.063590 2578 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c5c9c054-061b-421f-a220-c4f073642276-dshm\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:04:13.063845 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:13.063607 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5c9c054-061b-421f-a220-c4f073642276-kserve-provision-location\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:04:13.063845 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:13.063622 2578 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c5c9c054-061b-421f-a220-c4f073642276-model-cache\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:04:13.063845 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:13.063636 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dw7b8\" (UniqueName: \"kubernetes.io/projected/c5c9c054-061b-421f-a220-c4f073642276-kube-api-access-dw7b8\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:04:13.065376 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:13.065342 2578 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc_c5c9c054-061b-421f-a220-c4f073642276/main/0.log" Apr 22 19:04:13.065743 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:13.065702 2578 generic.go:358] "Generic (PLEG): container finished" podID="c5c9c054-061b-421f-a220-c4f073642276" containerID="cd2ee370c60f1a9acbc475097a7b642da961a7580d86c2a92902dd33d63ca20c" exitCode=137 Apr 22 19:04:13.065843 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:13.065789 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" Apr 22 19:04:13.065843 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:13.065794 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" event={"ID":"c5c9c054-061b-421f-a220-c4f073642276","Type":"ContainerDied","Data":"cd2ee370c60f1a9acbc475097a7b642da961a7580d86c2a92902dd33d63ca20c"} Apr 22 19:04:13.065972 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:13.065849 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc" event={"ID":"c5c9c054-061b-421f-a220-c4f073642276","Type":"ContainerDied","Data":"847d350c46324e0bc624975e3fbd5cbd98d13ed16307782a9a03656ac79a1e49"} Apr 22 19:04:13.065972 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:13.065874 2578 scope.go:117] "RemoveContainer" containerID="cd2ee370c60f1a9acbc475097a7b642da961a7580d86c2a92902dd33d63ca20c" Apr 22 19:04:13.084231 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:13.084212 2578 scope.go:117] "RemoveContainer" containerID="e3e7d5da70be83f7946a75ea9700db87feee95bdfe99ff4f690d074ed2e50e38" Apr 22 19:04:13.090306 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:13.090277 2578 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc"] Apr 22 19:04:13.094249 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:13.094223 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6dff989fcbqvvvc"] Apr 22 19:04:13.095341 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:13.095325 2578 scope.go:117] "RemoveContainer" containerID="cd2ee370c60f1a9acbc475097a7b642da961a7580d86c2a92902dd33d63ca20c" Apr 22 19:04:13.095681 ip-10-0-132-151 kubenswrapper[2578]: E0422 19:04:13.095652 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd2ee370c60f1a9acbc475097a7b642da961a7580d86c2a92902dd33d63ca20c\": container with ID starting with cd2ee370c60f1a9acbc475097a7b642da961a7580d86c2a92902dd33d63ca20c not found: ID does not exist" containerID="cd2ee370c60f1a9acbc475097a7b642da961a7580d86c2a92902dd33d63ca20c" Apr 22 19:04:13.095783 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:13.095693 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd2ee370c60f1a9acbc475097a7b642da961a7580d86c2a92902dd33d63ca20c"} err="failed to get container status \"cd2ee370c60f1a9acbc475097a7b642da961a7580d86c2a92902dd33d63ca20c\": rpc error: code = NotFound desc = could not find container \"cd2ee370c60f1a9acbc475097a7b642da961a7580d86c2a92902dd33d63ca20c\": container with ID starting with cd2ee370c60f1a9acbc475097a7b642da961a7580d86c2a92902dd33d63ca20c not found: ID does not exist" Apr 22 19:04:13.095783 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:13.095720 2578 scope.go:117] "RemoveContainer" containerID="e3e7d5da70be83f7946a75ea9700db87feee95bdfe99ff4f690d074ed2e50e38" Apr 22 19:04:13.096051 ip-10-0-132-151 kubenswrapper[2578]: E0422 19:04:13.096026 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"e3e7d5da70be83f7946a75ea9700db87feee95bdfe99ff4f690d074ed2e50e38\": container with ID starting with e3e7d5da70be83f7946a75ea9700db87feee95bdfe99ff4f690d074ed2e50e38 not found: ID does not exist" containerID="e3e7d5da70be83f7946a75ea9700db87feee95bdfe99ff4f690d074ed2e50e38"
Apr 22 19:04:13.096131 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:13.096058 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3e7d5da70be83f7946a75ea9700db87feee95bdfe99ff4f690d074ed2e50e38"} err="failed to get container status \"e3e7d5da70be83f7946a75ea9700db87feee95bdfe99ff4f690d074ed2e50e38\": rpc error: code = NotFound desc = could not find container \"e3e7d5da70be83f7946a75ea9700db87feee95bdfe99ff4f690d074ed2e50e38\": container with ID starting with e3e7d5da70be83f7946a75ea9700db87feee95bdfe99ff4f690d074ed2e50e38 not found: ID does not exist"
Apr 22 19:04:13.540366 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:13.540323 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5c9c054-061b-421f-a220-c4f073642276" path="/var/lib/kubelet/pods/c5c9c054-061b-421f-a220-c4f073642276/volumes"
Apr 22 19:04:18.523907 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:18.523857 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9" podUID="d327f30c-13b0-4f3d-b62f-22a8ea77ac77" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused"
Apr 22 19:04:18.541750 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:18.541696 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5" podUID="9ac1c96c-e418-41db-817d-45b6972b1f45" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused"
Apr 22 19:04:19.159875 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:19.159828 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 19:04:19.160204 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:19.160169 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f3c2c47f-0d54-44cb-b674-b366b08fd393" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused"
Apr 22 19:04:28.523733 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:28.523678 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9" podUID="d327f30c-13b0-4f3d-b62f-22a8ea77ac77" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused"
Apr 22 19:04:28.541739 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:28.541704 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5" podUID="9ac1c96c-e418-41db-817d-45b6972b1f45" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused"
Apr 22 19:04:29.159872 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:29.159831 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f3c2c47f-0d54-44cb-b674-b366b08fd393" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused"
Apr 22 19:04:38.523282 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:38.523230 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9" podUID="d327f30c-13b0-4f3d-b62f-22a8ea77ac77" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused"
Apr 22 19:04:38.542069 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:38.542022 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5" podUID="9ac1c96c-e418-41db-817d-45b6972b1f45" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused"
Apr 22 19:04:39.160005 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:39.159957 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f3c2c47f-0d54-44cb-b674-b366b08fd393" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused"
Apr 22 19:04:48.523979 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:48.523925 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9" podUID="d327f30c-13b0-4f3d-b62f-22a8ea77ac77" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused"
Apr 22 19:04:48.541839 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:48.541798 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5" podUID="9ac1c96c-e418-41db-817d-45b6972b1f45" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused"
Apr 22 19:04:49.159960 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:49.159910 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f3c2c47f-0d54-44cb-b674-b366b08fd393" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused"
Apr 22 19:04:58.523447 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:58.523389 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9" podUID="d327f30c-13b0-4f3d-b62f-22a8ea77ac77" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused"
Apr 22 19:04:58.540979 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:58.540936 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5" podUID="9ac1c96c-e418-41db-817d-45b6972b1f45" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused"
Apr 22 19:04:59.160190 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:04:59.160138 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f3c2c47f-0d54-44cb-b674-b366b08fd393" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused"
Apr 22 19:05:08.524131 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:05:08.524075 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9" podUID="d327f30c-13b0-4f3d-b62f-22a8ea77ac77" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused"
Apr 22 19:05:08.541122 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:05:08.541078 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5" podUID="9ac1c96c-e418-41db-817d-45b6972b1f45" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused"
Apr 22 19:05:09.159290 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:05:09.159238 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f3c2c47f-0d54-44cb-b674-b366b08fd393" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused"
Apr 22 19:05:18.523690 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:05:18.523641 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9" podUID="d327f30c-13b0-4f3d-b62f-22a8ea77ac77" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused"
Apr 22 19:05:18.541346 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:05:18.541293 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5" podUID="9ac1c96c-e418-41db-817d-45b6972b1f45" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused"
Apr 22 19:05:19.159505 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:05:19.159434 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f3c2c47f-0d54-44cb-b674-b366b08fd393" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused"
Apr 22 19:05:28.523477 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:05:28.523404 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9" podUID="d327f30c-13b0-4f3d-b62f-22a8ea77ac77" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused"
Apr 22 19:05:28.541091 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:05:28.541044 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5" podUID="9ac1c96c-e418-41db-817d-45b6972b1f45" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused"
Apr 22 19:05:29.160143 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:05:29.160089 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f3c2c47f-0d54-44cb-b674-b366b08fd393" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused"
Apr 22 19:05:38.523736 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:05:38.523686 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9" podUID="d327f30c-13b0-4f3d-b62f-22a8ea77ac77" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused"
Apr 22 19:05:38.541625 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:05:38.541579 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5" podUID="9ac1c96c-e418-41db-817d-45b6972b1f45" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused"
Apr 22 19:05:39.159485 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:05:39.159425 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f3c2c47f-0d54-44cb-b674-b366b08fd393" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused"
Apr 22 19:05:48.533903 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:05:48.533866 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9"
Apr 22 19:05:48.551922 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:05:48.551884 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5"
Apr 22 19:05:48.552114 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:05:48.551985 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9"
Apr 22 19:05:48.561503 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:05:48.561446 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5"
Apr 22 19:05:49.159426 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:05:49.159377 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f3c2c47f-0d54-44cb-b674-b366b08fd393" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused"
Apr 22 19:05:59.159913 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:05:59.159864 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f3c2c47f-0d54-44cb-b674-b366b08fd393" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused"
Apr 22 19:06:07.897534 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:07.897500 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9"]
Apr 22 19:06:07.898202 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:07.898144 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9" podUID="d327f30c-13b0-4f3d-b62f-22a8ea77ac77" containerName="main" containerID="cri-o://b70d43ae8630e0b5af11e57235fa913f81523699f0c1998c80b0acd6b0c46b19" gracePeriod=30
Apr 22 19:06:07.916720 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:07.916680 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5"]
Apr 22 19:06:07.917020 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:07.916989 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5" podUID="9ac1c96c-e418-41db-817d-45b6972b1f45" containerName="main" containerID="cri-o://7a65672ac2120a0d1ebfa2ec30ae76b54124b58bf43da156f5d2d96b42db63b2" gracePeriod=30
Apr 22 19:06:09.159405 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:09.159362 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f3c2c47f-0d54-44cb-b674-b366b08fd393" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused"
Apr 22 19:06:19.169391 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:19.169359 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 19:06:19.177310 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:19.177267 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 19:06:37.898384 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:37.898297 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9" podUID="d327f30c-13b0-4f3d-b62f-22a8ea77ac77" containerName="llm-d-routing-sidecar" containerID="cri-o://db8da86140ac4be673202acd654c379106ee464e0973892e3cc2fadbe4e01593" gracePeriod=2
Apr 22 19:06:38.192218 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.192191 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9_d327f30c-13b0-4f3d-b62f-22a8ea77ac77/main/0.log"
Apr 22 19:06:38.192905 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.192884 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9"
Apr 22 19:06:38.212737 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.212705 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5"
Apr 22 19:06:38.268953 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.268917 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d327f30c-13b0-4f3d-b62f-22a8ea77ac77-home\") pod \"d327f30c-13b0-4f3d-b62f-22a8ea77ac77\" (UID: \"d327f30c-13b0-4f3d-b62f-22a8ea77ac77\") "
Apr 22 19:06:38.268953 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.268966 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtxc6\" (UniqueName: \"kubernetes.io/projected/9ac1c96c-e418-41db-817d-45b6972b1f45-kube-api-access-rtxc6\") pod \"9ac1c96c-e418-41db-817d-45b6972b1f45\" (UID: \"9ac1c96c-e418-41db-817d-45b6972b1f45\") "
Apr 22 19:06:38.269218 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.268995 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9ac1c96c-e418-41db-817d-45b6972b1f45-model-cache\") pod \"9ac1c96c-e418-41db-817d-45b6972b1f45\" (UID: \"9ac1c96c-e418-41db-817d-45b6972b1f45\") "
Apr 22 19:06:38.269218 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.269014 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9ac1c96c-e418-41db-817d-45b6972b1f45-home\") pod \"9ac1c96c-e418-41db-817d-45b6972b1f45\" (UID: \"9ac1c96c-e418-41db-817d-45b6972b1f45\") "
Apr 22 19:06:38.269218 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.269046 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d327f30c-13b0-4f3d-b62f-22a8ea77ac77-kserve-provision-location\") pod \"d327f30c-13b0-4f3d-b62f-22a8ea77ac77\" (UID: \"d327f30c-13b0-4f3d-b62f-22a8ea77ac77\") "
Apr 22 19:06:38.269218 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.269065 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d327f30c-13b0-4f3d-b62f-22a8ea77ac77-tls-certs\") pod \"d327f30c-13b0-4f3d-b62f-22a8ea77ac77\" (UID: \"d327f30c-13b0-4f3d-b62f-22a8ea77ac77\") "
Apr 22 19:06:38.269218 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.269088 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d327f30c-13b0-4f3d-b62f-22a8ea77ac77-dshm\") pod \"d327f30c-13b0-4f3d-b62f-22a8ea77ac77\" (UID: \"d327f30c-13b0-4f3d-b62f-22a8ea77ac77\") "
Apr 22 19:06:38.269218 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.269109 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9ac1c96c-e418-41db-817d-45b6972b1f45-dshm\") pod \"9ac1c96c-e418-41db-817d-45b6972b1f45\" (UID: \"9ac1c96c-e418-41db-817d-45b6972b1f45\") "
Apr 22 19:06:38.269218 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.269134 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d327f30c-13b0-4f3d-b62f-22a8ea77ac77-model-cache\") pod \"d327f30c-13b0-4f3d-b62f-22a8ea77ac77\" (UID: \"d327f30c-13b0-4f3d-b62f-22a8ea77ac77\") "
Apr 22 19:06:38.269218 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.269186 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5j8t\" (UniqueName: \"kubernetes.io/projected/d327f30c-13b0-4f3d-b62f-22a8ea77ac77-kube-api-access-m5j8t\") pod \"d327f30c-13b0-4f3d-b62f-22a8ea77ac77\" (UID: \"d327f30c-13b0-4f3d-b62f-22a8ea77ac77\") "
Apr 22 19:06:38.269218 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.269213 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ac1c96c-e418-41db-817d-45b6972b1f45-kserve-provision-location\") pod \"9ac1c96c-e418-41db-817d-45b6972b1f45\" (UID: \"9ac1c96c-e418-41db-817d-45b6972b1f45\") "
Apr 22 19:06:38.269705 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.269259 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9ac1c96c-e418-41db-817d-45b6972b1f45-tls-certs\") pod \"9ac1c96c-e418-41db-817d-45b6972b1f45\" (UID: \"9ac1c96c-e418-41db-817d-45b6972b1f45\") "
Apr 22 19:06:38.269705 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.269309 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ac1c96c-e418-41db-817d-45b6972b1f45-model-cache" (OuterVolumeSpecName: "model-cache") pod "9ac1c96c-e418-41db-817d-45b6972b1f45" (UID: "9ac1c96c-e418-41db-817d-45b6972b1f45"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:06:38.269705 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.269381 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d327f30c-13b0-4f3d-b62f-22a8ea77ac77-home" (OuterVolumeSpecName: "home") pod "d327f30c-13b0-4f3d-b62f-22a8ea77ac77" (UID: "d327f30c-13b0-4f3d-b62f-22a8ea77ac77"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:06:38.269705 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.269454 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ac1c96c-e418-41db-817d-45b6972b1f45-home" (OuterVolumeSpecName: "home") pod "9ac1c96c-e418-41db-817d-45b6972b1f45" (UID: "9ac1c96c-e418-41db-817d-45b6972b1f45"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:06:38.269705 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.269577 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d327f30c-13b0-4f3d-b62f-22a8ea77ac77-model-cache" (OuterVolumeSpecName: "model-cache") pod "d327f30c-13b0-4f3d-b62f-22a8ea77ac77" (UID: "d327f30c-13b0-4f3d-b62f-22a8ea77ac77"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:06:38.269705 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.269690 2578 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d327f30c-13b0-4f3d-b62f-22a8ea77ac77-home\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 19:06:38.270028 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.269715 2578 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9ac1c96c-e418-41db-817d-45b6972b1f45-model-cache\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 19:06:38.270028 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.269733 2578 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9ac1c96c-e418-41db-817d-45b6972b1f45-home\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 19:06:38.270028 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.269745 2578 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d327f30c-13b0-4f3d-b62f-22a8ea77ac77-model-cache\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 19:06:38.271834 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.271793 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d327f30c-13b0-4f3d-b62f-22a8ea77ac77-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d327f30c-13b0-4f3d-b62f-22a8ea77ac77" (UID: "d327f30c-13b0-4f3d-b62f-22a8ea77ac77"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:06:38.271993 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.271839 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d327f30c-13b0-4f3d-b62f-22a8ea77ac77-dshm" (OuterVolumeSpecName: "dshm") pod "d327f30c-13b0-4f3d-b62f-22a8ea77ac77" (UID: "d327f30c-13b0-4f3d-b62f-22a8ea77ac77"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:06:38.271993 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.271912 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ac1c96c-e418-41db-817d-45b6972b1f45-kube-api-access-rtxc6" (OuterVolumeSpecName: "kube-api-access-rtxc6") pod "9ac1c96c-e418-41db-817d-45b6972b1f45" (UID: "9ac1c96c-e418-41db-817d-45b6972b1f45"). InnerVolumeSpecName "kube-api-access-rtxc6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:06:38.271993 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.271938 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d327f30c-13b0-4f3d-b62f-22a8ea77ac77-kube-api-access-m5j8t" (OuterVolumeSpecName: "kube-api-access-m5j8t") pod "d327f30c-13b0-4f3d-b62f-22a8ea77ac77" (UID: "d327f30c-13b0-4f3d-b62f-22a8ea77ac77"). InnerVolumeSpecName "kube-api-access-m5j8t". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:06:38.272184 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.272044 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ac1c96c-e418-41db-817d-45b6972b1f45-dshm" (OuterVolumeSpecName: "dshm") pod "9ac1c96c-e418-41db-817d-45b6972b1f45" (UID: "9ac1c96c-e418-41db-817d-45b6972b1f45"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:06:38.272745 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.272724 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ac1c96c-e418-41db-817d-45b6972b1f45-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "9ac1c96c-e418-41db-817d-45b6972b1f45" (UID: "9ac1c96c-e418-41db-817d-45b6972b1f45"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:06:38.328512 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.328445 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ac1c96c-e418-41db-817d-45b6972b1f45-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9ac1c96c-e418-41db-817d-45b6972b1f45" (UID: "9ac1c96c-e418-41db-817d-45b6972b1f45"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:06:38.329172 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.329146 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d327f30c-13b0-4f3d-b62f-22a8ea77ac77-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d327f30c-13b0-4f3d-b62f-22a8ea77ac77" (UID: "d327f30c-13b0-4f3d-b62f-22a8ea77ac77"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:06:38.370514 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.370474 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d327f30c-13b0-4f3d-b62f-22a8ea77ac77-kserve-provision-location\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 19:06:38.370514 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.370508 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d327f30c-13b0-4f3d-b62f-22a8ea77ac77-tls-certs\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 19:06:38.370514 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.370518 2578 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d327f30c-13b0-4f3d-b62f-22a8ea77ac77-dshm\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 19:06:38.370514 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.370526 2578 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9ac1c96c-e418-41db-817d-45b6972b1f45-dshm\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 19:06:38.370795 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.370536 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m5j8t\" (UniqueName: \"kubernetes.io/projected/d327f30c-13b0-4f3d-b62f-22a8ea77ac77-kube-api-access-m5j8t\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 19:06:38.370795 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.370545 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ac1c96c-e418-41db-817d-45b6972b1f45-kserve-provision-location\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 19:06:38.370795 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.370554 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9ac1c96c-e418-41db-817d-45b6972b1f45-tls-certs\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 19:06:38.370795 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.370565 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rtxc6\" (UniqueName: \"kubernetes.io/projected/9ac1c96c-e418-41db-817d-45b6972b1f45-kube-api-access-rtxc6\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\""
Apr 22 19:06:38.587333 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.587226 2578 generic.go:358] "Generic (PLEG): container finished" podID="9ac1c96c-e418-41db-817d-45b6972b1f45" containerID="7a65672ac2120a0d1ebfa2ec30ae76b54124b58bf43da156f5d2d96b42db63b2" exitCode=137
Apr 22 19:06:38.587531 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.587315 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5" event={"ID":"9ac1c96c-e418-41db-817d-45b6972b1f45","Type":"ContainerDied","Data":"7a65672ac2120a0d1ebfa2ec30ae76b54124b58bf43da156f5d2d96b42db63b2"}
Apr 22 19:06:38.587531 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.587379 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5" event={"ID":"9ac1c96c-e418-41db-817d-45b6972b1f45","Type":"ContainerDied","Data":"2e547d077a24966e35929a64a825c381486667593c55825e945043c272fd7b6f"}
Apr 22 19:06:38.587531 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.587394 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5"
Apr 22 19:06:38.587531 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.587415 2578 scope.go:117] "RemoveContainer" containerID="7a65672ac2120a0d1ebfa2ec30ae76b54124b58bf43da156f5d2d96b42db63b2"
Apr 22 19:06:38.588787 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.588771 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9_d327f30c-13b0-4f3d-b62f-22a8ea77ac77/main/0.log"
Apr 22 19:06:38.589559 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.589535 2578 generic.go:358] "Generic (PLEG): container finished" podID="d327f30c-13b0-4f3d-b62f-22a8ea77ac77" containerID="b70d43ae8630e0b5af11e57235fa913f81523699f0c1998c80b0acd6b0c46b19" exitCode=137
Apr 22 19:06:38.589644 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.589560 2578 generic.go:358] "Generic (PLEG): container finished" podID="d327f30c-13b0-4f3d-b62f-22a8ea77ac77" containerID="db8da86140ac4be673202acd654c379106ee464e0973892e3cc2fadbe4e01593" exitCode=0
Apr 22 19:06:38.589644 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.589600 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9" event={"ID":"d327f30c-13b0-4f3d-b62f-22a8ea77ac77","Type":"ContainerDied","Data":"b70d43ae8630e0b5af11e57235fa913f81523699f0c1998c80b0acd6b0c46b19"}
Apr 22 19:06:38.589644 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.589625 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9" event={"ID":"d327f30c-13b0-4f3d-b62f-22a8ea77ac77","Type":"ContainerDied","Data":"db8da86140ac4be673202acd654c379106ee464e0973892e3cc2fadbe4e01593"}
Apr 22 19:06:38.589644 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.589637 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9"
Apr 22 19:06:38.589776 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.589639 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9" event={"ID":"d327f30c-13b0-4f3d-b62f-22a8ea77ac77","Type":"ContainerDied","Data":"d6e99dc9cd5c0ddeccf191b657a0b71a586e7a5f9265f33bd9d3c00b240210c3"}
Apr 22 19:06:38.607930 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.607906 2578 scope.go:117] "RemoveContainer" containerID="914c4b11ea6e7da79d22f250b031931630235afe3a48219f9aad524e769e75e7"
Apr 22 19:06:38.612593 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.612563 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5"]
Apr 22 19:06:38.615699 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.615672 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-76b56d7d75-m7zl5"]
Apr 22 19:06:38.627938 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.627905 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9"]
Apr 22 19:06:38.633256 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.633223 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7b4b8c44cd-p5xf9"]
Apr 22 19:06:38.671568 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.671451 2578 scope.go:117] "RemoveContainer" containerID="7a65672ac2120a0d1ebfa2ec30ae76b54124b58bf43da156f5d2d96b42db63b2"
Apr 22 19:06:38.671892 ip-10-0-132-151 kubenswrapper[2578]: E0422 19:06:38.671866 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a65672ac2120a0d1ebfa2ec30ae76b54124b58bf43da156f5d2d96b42db63b2\": container with ID starting with 7a65672ac2120a0d1ebfa2ec30ae76b54124b58bf43da156f5d2d96b42db63b2 not found: ID does not exist" containerID="7a65672ac2120a0d1ebfa2ec30ae76b54124b58bf43da156f5d2d96b42db63b2"
Apr 22 19:06:38.671998 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.671898 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a65672ac2120a0d1ebfa2ec30ae76b54124b58bf43da156f5d2d96b42db63b2"} err="failed to get container status \"7a65672ac2120a0d1ebfa2ec30ae76b54124b58bf43da156f5d2d96b42db63b2\": rpc error: code = NotFound desc = could not find container \"7a65672ac2120a0d1ebfa2ec30ae76b54124b58bf43da156f5d2d96b42db63b2\": container with ID starting with 7a65672ac2120a0d1ebfa2ec30ae76b54124b58bf43da156f5d2d96b42db63b2 not found: ID does not exist"
Apr 22 19:06:38.671998 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.671918 2578 scope.go:117] "RemoveContainer" containerID="914c4b11ea6e7da79d22f250b031931630235afe3a48219f9aad524e769e75e7"
Apr 22 19:06:38.672182 ip-10-0-132-151 kubenswrapper[2578]: E0422 19:06:38.672165 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"914c4b11ea6e7da79d22f250b031931630235afe3a48219f9aad524e769e75e7\": container with ID starting with 914c4b11ea6e7da79d22f250b031931630235afe3a48219f9aad524e769e75e7 not found: ID does not exist" containerID="914c4b11ea6e7da79d22f250b031931630235afe3a48219f9aad524e769e75e7"
Apr 22 19:06:38.672230 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.672187 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"914c4b11ea6e7da79d22f250b031931630235afe3a48219f9aad524e769e75e7"} err="failed to get container status \"914c4b11ea6e7da79d22f250b031931630235afe3a48219f9aad524e769e75e7\": rpc error: code = NotFound desc = could not find container \"914c4b11ea6e7da79d22f250b031931630235afe3a48219f9aad524e769e75e7\": container with ID starting with 914c4b11ea6e7da79d22f250b031931630235afe3a48219f9aad524e769e75e7 not found: ID does not exist"
Apr 22 19:06:38.672230 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.672200 2578 scope.go:117] "RemoveContainer" containerID="b70d43ae8630e0b5af11e57235fa913f81523699f0c1998c80b0acd6b0c46b19"
Apr 22 19:06:38.691269 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.691245 2578 scope.go:117] "RemoveContainer" containerID="7930c7dd752e9d5063cf9ea97870f4bb21864183a3f09dcf56d52ac9a5cc8a69"
Apr 22 19:06:38.752361 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.752338 2578 scope.go:117] "RemoveContainer" containerID="db8da86140ac4be673202acd654c379106ee464e0973892e3cc2fadbe4e01593"
Apr 22 19:06:38.760205 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.760178 2578 scope.go:117] "RemoveContainer" containerID="b70d43ae8630e0b5af11e57235fa913f81523699f0c1998c80b0acd6b0c46b19"
Apr 22 19:06:38.760563 ip-10-0-132-151 kubenswrapper[2578]: E0422 19:06:38.760530 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b70d43ae8630e0b5af11e57235fa913f81523699f0c1998c80b0acd6b0c46b19\": container with ID starting with b70d43ae8630e0b5af11e57235fa913f81523699f0c1998c80b0acd6b0c46b19 not found: ID does not exist" containerID="b70d43ae8630e0b5af11e57235fa913f81523699f0c1998c80b0acd6b0c46b19"
Apr 22 19:06:38.760641 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.760571 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b70d43ae8630e0b5af11e57235fa913f81523699f0c1998c80b0acd6b0c46b19"} err="failed to get container status \"b70d43ae8630e0b5af11e57235fa913f81523699f0c1998c80b0acd6b0c46b19\": rpc error: code = NotFound desc = could not find container \"b70d43ae8630e0b5af11e57235fa913f81523699f0c1998c80b0acd6b0c46b19\": container with ID starting with
b70d43ae8630e0b5af11e57235fa913f81523699f0c1998c80b0acd6b0c46b19 not found: ID does not exist" Apr 22 19:06:38.760641 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.760592 2578 scope.go:117] "RemoveContainer" containerID="7930c7dd752e9d5063cf9ea97870f4bb21864183a3f09dcf56d52ac9a5cc8a69" Apr 22 19:06:38.760890 ip-10-0-132-151 kubenswrapper[2578]: E0422 19:06:38.760870 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7930c7dd752e9d5063cf9ea97870f4bb21864183a3f09dcf56d52ac9a5cc8a69\": container with ID starting with 7930c7dd752e9d5063cf9ea97870f4bb21864183a3f09dcf56d52ac9a5cc8a69 not found: ID does not exist" containerID="7930c7dd752e9d5063cf9ea97870f4bb21864183a3f09dcf56d52ac9a5cc8a69" Apr 22 19:06:38.760933 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.760896 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7930c7dd752e9d5063cf9ea97870f4bb21864183a3f09dcf56d52ac9a5cc8a69"} err="failed to get container status \"7930c7dd752e9d5063cf9ea97870f4bb21864183a3f09dcf56d52ac9a5cc8a69\": rpc error: code = NotFound desc = could not find container \"7930c7dd752e9d5063cf9ea97870f4bb21864183a3f09dcf56d52ac9a5cc8a69\": container with ID starting with 7930c7dd752e9d5063cf9ea97870f4bb21864183a3f09dcf56d52ac9a5cc8a69 not found: ID does not exist" Apr 22 19:06:38.760933 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.760914 2578 scope.go:117] "RemoveContainer" containerID="db8da86140ac4be673202acd654c379106ee464e0973892e3cc2fadbe4e01593" Apr 22 19:06:38.761185 ip-10-0-132-151 kubenswrapper[2578]: E0422 19:06:38.761169 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db8da86140ac4be673202acd654c379106ee464e0973892e3cc2fadbe4e01593\": container with ID starting with db8da86140ac4be673202acd654c379106ee464e0973892e3cc2fadbe4e01593 not found: ID does not exist" 
containerID="db8da86140ac4be673202acd654c379106ee464e0973892e3cc2fadbe4e01593" Apr 22 19:06:38.761247 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.761189 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db8da86140ac4be673202acd654c379106ee464e0973892e3cc2fadbe4e01593"} err="failed to get container status \"db8da86140ac4be673202acd654c379106ee464e0973892e3cc2fadbe4e01593\": rpc error: code = NotFound desc = could not find container \"db8da86140ac4be673202acd654c379106ee464e0973892e3cc2fadbe4e01593\": container with ID starting with db8da86140ac4be673202acd654c379106ee464e0973892e3cc2fadbe4e01593 not found: ID does not exist" Apr 22 19:06:38.761247 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.761207 2578 scope.go:117] "RemoveContainer" containerID="b70d43ae8630e0b5af11e57235fa913f81523699f0c1998c80b0acd6b0c46b19" Apr 22 19:06:38.761442 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.761420 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b70d43ae8630e0b5af11e57235fa913f81523699f0c1998c80b0acd6b0c46b19"} err="failed to get container status \"b70d43ae8630e0b5af11e57235fa913f81523699f0c1998c80b0acd6b0c46b19\": rpc error: code = NotFound desc = could not find container \"b70d43ae8630e0b5af11e57235fa913f81523699f0c1998c80b0acd6b0c46b19\": container with ID starting with b70d43ae8630e0b5af11e57235fa913f81523699f0c1998c80b0acd6b0c46b19 not found: ID does not exist" Apr 22 19:06:38.761546 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.761444 2578 scope.go:117] "RemoveContainer" containerID="7930c7dd752e9d5063cf9ea97870f4bb21864183a3f09dcf56d52ac9a5cc8a69" Apr 22 19:06:38.761719 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.761699 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7930c7dd752e9d5063cf9ea97870f4bb21864183a3f09dcf56d52ac9a5cc8a69"} err="failed to get container status 
\"7930c7dd752e9d5063cf9ea97870f4bb21864183a3f09dcf56d52ac9a5cc8a69\": rpc error: code = NotFound desc = could not find container \"7930c7dd752e9d5063cf9ea97870f4bb21864183a3f09dcf56d52ac9a5cc8a69\": container with ID starting with 7930c7dd752e9d5063cf9ea97870f4bb21864183a3f09dcf56d52ac9a5cc8a69 not found: ID does not exist" Apr 22 19:06:38.761791 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.761721 2578 scope.go:117] "RemoveContainer" containerID="db8da86140ac4be673202acd654c379106ee464e0973892e3cc2fadbe4e01593" Apr 22 19:06:38.761968 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:38.761948 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db8da86140ac4be673202acd654c379106ee464e0973892e3cc2fadbe4e01593"} err="failed to get container status \"db8da86140ac4be673202acd654c379106ee464e0973892e3cc2fadbe4e01593\": rpc error: code = NotFound desc = could not find container \"db8da86140ac4be673202acd654c379106ee464e0973892e3cc2fadbe4e01593\": container with ID starting with db8da86140ac4be673202acd654c379106ee464e0973892e3cc2fadbe4e01593 not found: ID does not exist" Apr 22 19:06:39.541735 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:39.541694 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ac1c96c-e418-41db-817d-45b6972b1f45" path="/var/lib/kubelet/pods/9ac1c96c-e418-41db-817d-45b6972b1f45/volumes" Apr 22 19:06:39.542477 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:39.542445 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d327f30c-13b0-4f3d-b62f-22a8ea77ac77" path="/var/lib/kubelet/pods/d327f30c-13b0-4f3d-b62f-22a8ea77ac77/volumes" Apr 22 19:06:54.346788 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:54.346742 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 22 19:06:54.347320 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:54.347137 2578 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f3c2c47f-0d54-44cb-b674-b366b08fd393" containerName="main" containerID="cri-o://5c5f45abc9867831b1ab1b5722098133efa2f30290993f5389d96711ce26fbc8" gracePeriod=30 Apr 22 19:06:55.192102 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:55.192078 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:06:55.317817 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:55.317726 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f3c2c47f-0d54-44cb-b674-b366b08fd393-dshm\") pod \"f3c2c47f-0d54-44cb-b674-b366b08fd393\" (UID: \"f3c2c47f-0d54-44cb-b674-b366b08fd393\") " Apr 22 19:06:55.317817 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:55.317782 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gc8b7\" (UniqueName: \"kubernetes.io/projected/f3c2c47f-0d54-44cb-b674-b366b08fd393-kube-api-access-gc8b7\") pod \"f3c2c47f-0d54-44cb-b674-b366b08fd393\" (UID: \"f3c2c47f-0d54-44cb-b674-b366b08fd393\") " Apr 22 19:06:55.317817 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:55.317799 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f3c2c47f-0d54-44cb-b674-b366b08fd393-kserve-provision-location\") pod \"f3c2c47f-0d54-44cb-b674-b366b08fd393\" (UID: \"f3c2c47f-0d54-44cb-b674-b366b08fd393\") " Apr 22 19:06:55.318114 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:55.317824 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f3c2c47f-0d54-44cb-b674-b366b08fd393-tls-certs\") pod 
\"f3c2c47f-0d54-44cb-b674-b366b08fd393\" (UID: \"f3c2c47f-0d54-44cb-b674-b366b08fd393\") " Apr 22 19:06:55.318114 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:55.317840 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f3c2c47f-0d54-44cb-b674-b366b08fd393-home\") pod \"f3c2c47f-0d54-44cb-b674-b366b08fd393\" (UID: \"f3c2c47f-0d54-44cb-b674-b366b08fd393\") " Apr 22 19:06:55.318114 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:55.317889 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f3c2c47f-0d54-44cb-b674-b366b08fd393-model-cache\") pod \"f3c2c47f-0d54-44cb-b674-b366b08fd393\" (UID: \"f3c2c47f-0d54-44cb-b674-b366b08fd393\") " Apr 22 19:06:55.318273 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:55.318228 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3c2c47f-0d54-44cb-b674-b366b08fd393-model-cache" (OuterVolumeSpecName: "model-cache") pod "f3c2c47f-0d54-44cb-b674-b366b08fd393" (UID: "f3c2c47f-0d54-44cb-b674-b366b08fd393"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:06:55.318273 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:55.318241 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3c2c47f-0d54-44cb-b674-b366b08fd393-home" (OuterVolumeSpecName: "home") pod "f3c2c47f-0d54-44cb-b674-b366b08fd393" (UID: "f3c2c47f-0d54-44cb-b674-b366b08fd393"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:06:55.319871 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:55.319849 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3c2c47f-0d54-44cb-b674-b366b08fd393-dshm" (OuterVolumeSpecName: "dshm") pod "f3c2c47f-0d54-44cb-b674-b366b08fd393" (UID: "f3c2c47f-0d54-44cb-b674-b366b08fd393"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:06:55.320337 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:55.320311 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3c2c47f-0d54-44cb-b674-b366b08fd393-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f3c2c47f-0d54-44cb-b674-b366b08fd393" (UID: "f3c2c47f-0d54-44cb-b674-b366b08fd393"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:06:55.320434 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:55.320333 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3c2c47f-0d54-44cb-b674-b366b08fd393-kube-api-access-gc8b7" (OuterVolumeSpecName: "kube-api-access-gc8b7") pod "f3c2c47f-0d54-44cb-b674-b366b08fd393" (UID: "f3c2c47f-0d54-44cb-b674-b366b08fd393"). InnerVolumeSpecName "kube-api-access-gc8b7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:06:55.383372 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:55.383325 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3c2c47f-0d54-44cb-b674-b366b08fd393-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f3c2c47f-0d54-44cb-b674-b366b08fd393" (UID: "f3c2c47f-0d54-44cb-b674-b366b08fd393"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:06:55.419206 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:55.419164 2578 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f3c2c47f-0d54-44cb-b674-b366b08fd393-model-cache\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:06:55.419206 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:55.419206 2578 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f3c2c47f-0d54-44cb-b674-b366b08fd393-dshm\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:06:55.419206 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:55.419217 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gc8b7\" (UniqueName: \"kubernetes.io/projected/f3c2c47f-0d54-44cb-b674-b366b08fd393-kube-api-access-gc8b7\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:06:55.419443 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:55.419227 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f3c2c47f-0d54-44cb-b674-b366b08fd393-kserve-provision-location\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:06:55.419443 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:55.419237 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f3c2c47f-0d54-44cb-b674-b366b08fd393-tls-certs\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:06:55.419443 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:55.419247 2578 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f3c2c47f-0d54-44cb-b674-b366b08fd393-home\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:06:55.648693 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:55.648657 2578 
generic.go:358] "Generic (PLEG): container finished" podID="f3c2c47f-0d54-44cb-b674-b366b08fd393" containerID="5c5f45abc9867831b1ab1b5722098133efa2f30290993f5389d96711ce26fbc8" exitCode=0 Apr 22 19:06:55.648902 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:55.648731 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:06:55.648902 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:55.648723 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"f3c2c47f-0d54-44cb-b674-b366b08fd393","Type":"ContainerDied","Data":"5c5f45abc9867831b1ab1b5722098133efa2f30290993f5389d96711ce26fbc8"} Apr 22 19:06:55.648902 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:55.648774 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"f3c2c47f-0d54-44cb-b674-b366b08fd393","Type":"ContainerDied","Data":"cb166c0b36e9ecfbbc29d9d53125f419b0f48be1141e9c1b002fe6a27f717e30"} Apr 22 19:06:55.648902 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:55.648795 2578 scope.go:117] "RemoveContainer" containerID="5c5f45abc9867831b1ab1b5722098133efa2f30290993f5389d96711ce26fbc8" Apr 22 19:06:55.666548 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:55.666512 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 22 19:06:55.668559 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:55.668529 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 22 19:06:55.669347 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:55.669329 2578 scope.go:117] "RemoveContainer" containerID="7e505b3529882f39909d2a69f2078f159e5016def76b0a5ba96b403b814848e2" Apr 22 
19:06:55.735716 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:55.735696 2578 scope.go:117] "RemoveContainer" containerID="5c5f45abc9867831b1ab1b5722098133efa2f30290993f5389d96711ce26fbc8" Apr 22 19:06:55.736006 ip-10-0-132-151 kubenswrapper[2578]: E0422 19:06:55.735985 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c5f45abc9867831b1ab1b5722098133efa2f30290993f5389d96711ce26fbc8\": container with ID starting with 5c5f45abc9867831b1ab1b5722098133efa2f30290993f5389d96711ce26fbc8 not found: ID does not exist" containerID="5c5f45abc9867831b1ab1b5722098133efa2f30290993f5389d96711ce26fbc8" Apr 22 19:06:55.736079 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:55.736015 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c5f45abc9867831b1ab1b5722098133efa2f30290993f5389d96711ce26fbc8"} err="failed to get container status \"5c5f45abc9867831b1ab1b5722098133efa2f30290993f5389d96711ce26fbc8\": rpc error: code = NotFound desc = could not find container \"5c5f45abc9867831b1ab1b5722098133efa2f30290993f5389d96711ce26fbc8\": container with ID starting with 5c5f45abc9867831b1ab1b5722098133efa2f30290993f5389d96711ce26fbc8 not found: ID does not exist" Apr 22 19:06:55.736079 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:55.736035 2578 scope.go:117] "RemoveContainer" containerID="7e505b3529882f39909d2a69f2078f159e5016def76b0a5ba96b403b814848e2" Apr 22 19:06:55.736291 ip-10-0-132-151 kubenswrapper[2578]: E0422 19:06:55.736272 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e505b3529882f39909d2a69f2078f159e5016def76b0a5ba96b403b814848e2\": container with ID starting with 7e505b3529882f39909d2a69f2078f159e5016def76b0a5ba96b403b814848e2 not found: ID does not exist" containerID="7e505b3529882f39909d2a69f2078f159e5016def76b0a5ba96b403b814848e2" Apr 22 19:06:55.736329 
ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:55.736299 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e505b3529882f39909d2a69f2078f159e5016def76b0a5ba96b403b814848e2"} err="failed to get container status \"7e505b3529882f39909d2a69f2078f159e5016def76b0a5ba96b403b814848e2\": rpc error: code = NotFound desc = could not find container \"7e505b3529882f39909d2a69f2078f159e5016def76b0a5ba96b403b814848e2\": container with ID starting with 7e505b3529882f39909d2a69f2078f159e5016def76b0a5ba96b403b814848e2 not found: ID does not exist" Apr 22 19:06:57.542214 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:06:57.542179 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3c2c47f-0d54-44cb-b674-b366b08fd393" path="/var/lib/kubelet/pods/f3c2c47f-0d54-44cb-b674-b366b08fd393/volumes" Apr 22 19:07:22.234480 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.234429 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk"] Apr 22 19:07:22.234977 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.234806 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d327f30c-13b0-4f3d-b62f-22a8ea77ac77" containerName="llm-d-routing-sidecar" Apr 22 19:07:22.234977 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.234820 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d327f30c-13b0-4f3d-b62f-22a8ea77ac77" containerName="llm-d-routing-sidecar" Apr 22 19:07:22.234977 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.234830 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9ac1c96c-e418-41db-817d-45b6972b1f45" containerName="storage-initializer" Apr 22 19:07:22.234977 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.234836 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ac1c96c-e418-41db-817d-45b6972b1f45" containerName="storage-initializer" Apr 22 
19:07:22.234977 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.234845 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3c2c47f-0d54-44cb-b674-b366b08fd393" containerName="main" Apr 22 19:07:22.234977 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.234850 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3c2c47f-0d54-44cb-b674-b366b08fd393" containerName="main" Apr 22 19:07:22.234977 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.234862 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d327f30c-13b0-4f3d-b62f-22a8ea77ac77" containerName="main" Apr 22 19:07:22.234977 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.234869 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d327f30c-13b0-4f3d-b62f-22a8ea77ac77" containerName="main" Apr 22 19:07:22.234977 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.234882 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3c2c47f-0d54-44cb-b674-b366b08fd393" containerName="storage-initializer" Apr 22 19:07:22.234977 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.234887 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3c2c47f-0d54-44cb-b674-b366b08fd393" containerName="storage-initializer" Apr 22 19:07:22.234977 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.234894 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5c9c054-061b-421f-a220-c4f073642276" containerName="storage-initializer" Apr 22 19:07:22.234977 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.234899 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c9c054-061b-421f-a220-c4f073642276" containerName="storage-initializer" Apr 22 19:07:22.234977 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.234905 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5c9c054-061b-421f-a220-c4f073642276" containerName="main" Apr 22 
19:07:22.234977 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.234909 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c9c054-061b-421f-a220-c4f073642276" containerName="main" Apr 22 19:07:22.234977 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.234915 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9ac1c96c-e418-41db-817d-45b6972b1f45" containerName="main" Apr 22 19:07:22.234977 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.234920 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ac1c96c-e418-41db-817d-45b6972b1f45" containerName="main" Apr 22 19:07:22.234977 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.234930 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d327f30c-13b0-4f3d-b62f-22a8ea77ac77" containerName="storage-initializer" Apr 22 19:07:22.234977 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.234938 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d327f30c-13b0-4f3d-b62f-22a8ea77ac77" containerName="storage-initializer" Apr 22 19:07:22.235568 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.234997 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d327f30c-13b0-4f3d-b62f-22a8ea77ac77" containerName="llm-d-routing-sidecar" Apr 22 19:07:22.235568 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.235009 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d327f30c-13b0-4f3d-b62f-22a8ea77ac77" containerName="main" Apr 22 19:07:22.235568 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.235020 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="9ac1c96c-e418-41db-817d-45b6972b1f45" containerName="main" Apr 22 19:07:22.235568 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.235031 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f3c2c47f-0d54-44cb-b674-b366b08fd393" containerName="main" Apr 22 19:07:22.235568 ip-10-0-132-151 kubenswrapper[2578]: 
I0422 19:07:22.235038 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c5c9c054-061b-421f-a220-c4f073642276" containerName="main" Apr 22 19:07:22.237968 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.237950 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk" Apr 22 19:07:22.240191 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.240166 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-hsgld\"" Apr 22 19:07:22.240370 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.240353 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 22 19:07:22.244428 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.244394 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f89f2d68-754a-41b4-85f9-fc71df7c37c5-home\") pod \"scheduler-inline-config-test-kserve-564cb98467-2fmrk\" (UID: \"f89f2d68-754a-41b4-85f9-fc71df7c37c5\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk" Apr 22 19:07:22.244575 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.244480 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f89f2d68-754a-41b4-85f9-fc71df7c37c5-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-564cb98467-2fmrk\" (UID: \"f89f2d68-754a-41b4-85f9-fc71df7c37c5\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk" Apr 22 19:07:22.245094 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.245069 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-wqss5\" (UniqueName: \"kubernetes.io/projected/f89f2d68-754a-41b4-85f9-fc71df7c37c5-kube-api-access-wqss5\") pod \"scheduler-inline-config-test-kserve-564cb98467-2fmrk\" (UID: \"f89f2d68-754a-41b4-85f9-fc71df7c37c5\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk" Apr 22 19:07:22.245194 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.245178 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f89f2d68-754a-41b4-85f9-fc71df7c37c5-tls-certs\") pod \"scheduler-inline-config-test-kserve-564cb98467-2fmrk\" (UID: \"f89f2d68-754a-41b4-85f9-fc71df7c37c5\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk" Apr 22 19:07:22.245257 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.245230 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f89f2d68-754a-41b4-85f9-fc71df7c37c5-model-cache\") pod \"scheduler-inline-config-test-kserve-564cb98467-2fmrk\" (UID: \"f89f2d68-754a-41b4-85f9-fc71df7c37c5\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk" Apr 22 19:07:22.245326 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.245309 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f89f2d68-754a-41b4-85f9-fc71df7c37c5-dshm\") pod \"scheduler-inline-config-test-kserve-564cb98467-2fmrk\" (UID: \"f89f2d68-754a-41b4-85f9-fc71df7c37c5\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk" Apr 22 19:07:22.246757 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.246720 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk"] Apr 22 19:07:22.345988 ip-10-0-132-151 
kubenswrapper[2578]: I0422 19:07:22.345947 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f89f2d68-754a-41b4-85f9-fc71df7c37c5-tls-certs\") pod \"scheduler-inline-config-test-kserve-564cb98467-2fmrk\" (UID: \"f89f2d68-754a-41b4-85f9-fc71df7c37c5\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk" Apr 22 19:07:22.345988 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.345993 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f89f2d68-754a-41b4-85f9-fc71df7c37c5-model-cache\") pod \"scheduler-inline-config-test-kserve-564cb98467-2fmrk\" (UID: \"f89f2d68-754a-41b4-85f9-fc71df7c37c5\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk" Apr 22 19:07:22.346266 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.346029 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f89f2d68-754a-41b4-85f9-fc71df7c37c5-dshm\") pod \"scheduler-inline-config-test-kserve-564cb98467-2fmrk\" (UID: \"f89f2d68-754a-41b4-85f9-fc71df7c37c5\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk" Apr 22 19:07:22.346266 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.346049 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f89f2d68-754a-41b4-85f9-fc71df7c37c5-home\") pod \"scheduler-inline-config-test-kserve-564cb98467-2fmrk\" (UID: \"f89f2d68-754a-41b4-85f9-fc71df7c37c5\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk" Apr 22 19:07:22.346266 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.346083 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/f89f2d68-754a-41b4-85f9-fc71df7c37c5-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-564cb98467-2fmrk\" (UID: \"f89f2d68-754a-41b4-85f9-fc71df7c37c5\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk" Apr 22 19:07:22.346266 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.346112 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqss5\" (UniqueName: \"kubernetes.io/projected/f89f2d68-754a-41b4-85f9-fc71df7c37c5-kube-api-access-wqss5\") pod \"scheduler-inline-config-test-kserve-564cb98467-2fmrk\" (UID: \"f89f2d68-754a-41b4-85f9-fc71df7c37c5\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk" Apr 22 19:07:22.346571 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.346547 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f89f2d68-754a-41b4-85f9-fc71df7c37c5-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-564cb98467-2fmrk\" (UID: \"f89f2d68-754a-41b4-85f9-fc71df7c37c5\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk" Apr 22 19:07:22.346636 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.346565 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f89f2d68-754a-41b4-85f9-fc71df7c37c5-model-cache\") pod \"scheduler-inline-config-test-kserve-564cb98467-2fmrk\" (UID: \"f89f2d68-754a-41b4-85f9-fc71df7c37c5\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk" Apr 22 19:07:22.346681 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.346637 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f89f2d68-754a-41b4-85f9-fc71df7c37c5-home\") pod \"scheduler-inline-config-test-kserve-564cb98467-2fmrk\" 
(UID: \"f89f2d68-754a-41b4-85f9-fc71df7c37c5\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk" Apr 22 19:07:22.348381 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.348358 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f89f2d68-754a-41b4-85f9-fc71df7c37c5-dshm\") pod \"scheduler-inline-config-test-kserve-564cb98467-2fmrk\" (UID: \"f89f2d68-754a-41b4-85f9-fc71df7c37c5\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk" Apr 22 19:07:22.348671 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.348655 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f89f2d68-754a-41b4-85f9-fc71df7c37c5-tls-certs\") pod \"scheduler-inline-config-test-kserve-564cb98467-2fmrk\" (UID: \"f89f2d68-754a-41b4-85f9-fc71df7c37c5\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk" Apr 22 19:07:22.354215 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.354186 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqss5\" (UniqueName: \"kubernetes.io/projected/f89f2d68-754a-41b4-85f9-fc71df7c37c5-kube-api-access-wqss5\") pod \"scheduler-inline-config-test-kserve-564cb98467-2fmrk\" (UID: \"f89f2d68-754a-41b4-85f9-fc71df7c37c5\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk" Apr 22 19:07:22.548593 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.548489 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk" Apr 22 19:07:22.696079 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.696042 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk"] Apr 22 19:07:22.703419 ip-10-0-132-151 kubenswrapper[2578]: W0422 19:07:22.703393 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf89f2d68_754a_41b4_85f9_fc71df7c37c5.slice/crio-6b1098fccfa22c37f899c087a02bab9d44db05a508832dadad5c3949b0d2f9b3 WatchSource:0}: Error finding container 6b1098fccfa22c37f899c087a02bab9d44db05a508832dadad5c3949b0d2f9b3: Status 404 returned error can't find the container with id 6b1098fccfa22c37f899c087a02bab9d44db05a508832dadad5c3949b0d2f9b3 Apr 22 19:07:22.705562 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.705542 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:07:22.735452 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:22.735421 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk" event={"ID":"f89f2d68-754a-41b4-85f9-fc71df7c37c5","Type":"ContainerStarted","Data":"6b1098fccfa22c37f899c087a02bab9d44db05a508832dadad5c3949b0d2f9b3"} Apr 22 19:07:23.739876 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:23.739837 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk" event={"ID":"f89f2d68-754a-41b4-85f9-fc71df7c37c5","Type":"ContainerStarted","Data":"50e08471cbb669f2748d44940cfd9ec02b00c32c3e765d597208f1e966268f9b"} Apr 22 19:07:27.755508 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:27.755447 2578 generic.go:358] "Generic (PLEG): container finished" podID="f89f2d68-754a-41b4-85f9-fc71df7c37c5" 
containerID="50e08471cbb669f2748d44940cfd9ec02b00c32c3e765d597208f1e966268f9b" exitCode=0 Apr 22 19:07:27.756200 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:27.755525 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk" event={"ID":"f89f2d68-754a-41b4-85f9-fc71df7c37c5","Type":"ContainerDied","Data":"50e08471cbb669f2748d44940cfd9ec02b00c32c3e765d597208f1e966268f9b"} Apr 22 19:07:28.761394 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:28.761307 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk" event={"ID":"f89f2d68-754a-41b4-85f9-fc71df7c37c5","Type":"ContainerStarted","Data":"78af77da66f8d215ba9c69c7e5f9fe76211170771da294236432a21900242639"} Apr 22 19:07:28.782533 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:28.782473 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk" podStartSLOduration=6.782438903 podStartE2EDuration="6.782438903s" podCreationTimestamp="2026-04-22 19:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:07:28.780536716 +0000 UTC m=+1777.770803345" watchObservedRunningTime="2026-04-22 19:07:28.782438903 +0000 UTC m=+1777.772705589" Apr 22 19:07:32.549354 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:32.549286 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk" Apr 22 19:07:32.549354 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:32.549355 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk" Apr 22 19:07:32.562031 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:32.562003 
2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk" Apr 22 19:07:32.786033 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:32.786002 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk" Apr 22 19:07:51.556840 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:51.556812 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2rx9_14a4227e-0dba-42ae-a250-c23eb012b836/ovn-acl-logging/0.log" Apr 22 19:07:51.564278 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:51.564256 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2rx9_14a4227e-0dba-42ae-a250-c23eb012b836/ovn-acl-logging/0.log" Apr 22 19:07:55.579451 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:55.579350 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk"] Apr 22 19:07:55.579923 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:55.579765 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk" podUID="f89f2d68-754a-41b4-85f9-fc71df7c37c5" containerName="main" containerID="cri-o://78af77da66f8d215ba9c69c7e5f9fe76211170771da294236432a21900242639" gracePeriod=30 Apr 22 19:07:55.851951 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:55.851923 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk" Apr 22 19:07:55.852716 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:55.852693 2578 generic.go:358] "Generic (PLEG): container finished" podID="f89f2d68-754a-41b4-85f9-fc71df7c37c5" containerID="78af77da66f8d215ba9c69c7e5f9fe76211170771da294236432a21900242639" exitCode=0 Apr 22 19:07:55.852793 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:55.852768 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk" event={"ID":"f89f2d68-754a-41b4-85f9-fc71df7c37c5","Type":"ContainerDied","Data":"78af77da66f8d215ba9c69c7e5f9fe76211170771da294236432a21900242639"} Apr 22 19:07:55.852833 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:55.852809 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk" event={"ID":"f89f2d68-754a-41b4-85f9-fc71df7c37c5","Type":"ContainerDied","Data":"6b1098fccfa22c37f899c087a02bab9d44db05a508832dadad5c3949b0d2f9b3"} Apr 22 19:07:55.852833 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:55.852825 2578 scope.go:117] "RemoveContainer" containerID="78af77da66f8d215ba9c69c7e5f9fe76211170771da294236432a21900242639" Apr 22 19:07:55.860420 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:55.860401 2578 scope.go:117] "RemoveContainer" containerID="50e08471cbb669f2748d44940cfd9ec02b00c32c3e765d597208f1e966268f9b" Apr 22 19:07:55.931793 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:55.931772 2578 scope.go:117] "RemoveContainer" containerID="78af77da66f8d215ba9c69c7e5f9fe76211170771da294236432a21900242639" Apr 22 19:07:55.932139 ip-10-0-132-151 kubenswrapper[2578]: E0422 19:07:55.932114 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78af77da66f8d215ba9c69c7e5f9fe76211170771da294236432a21900242639\": container with ID 
starting with 78af77da66f8d215ba9c69c7e5f9fe76211170771da294236432a21900242639 not found: ID does not exist" containerID="78af77da66f8d215ba9c69c7e5f9fe76211170771da294236432a21900242639" Apr 22 19:07:55.932194 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:55.932150 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78af77da66f8d215ba9c69c7e5f9fe76211170771da294236432a21900242639"} err="failed to get container status \"78af77da66f8d215ba9c69c7e5f9fe76211170771da294236432a21900242639\": rpc error: code = NotFound desc = could not find container \"78af77da66f8d215ba9c69c7e5f9fe76211170771da294236432a21900242639\": container with ID starting with 78af77da66f8d215ba9c69c7e5f9fe76211170771da294236432a21900242639 not found: ID does not exist" Apr 22 19:07:55.932194 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:55.932171 2578 scope.go:117] "RemoveContainer" containerID="50e08471cbb669f2748d44940cfd9ec02b00c32c3e765d597208f1e966268f9b" Apr 22 19:07:55.932394 ip-10-0-132-151 kubenswrapper[2578]: E0422 19:07:55.932376 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50e08471cbb669f2748d44940cfd9ec02b00c32c3e765d597208f1e966268f9b\": container with ID starting with 50e08471cbb669f2748d44940cfd9ec02b00c32c3e765d597208f1e966268f9b not found: ID does not exist" containerID="50e08471cbb669f2748d44940cfd9ec02b00c32c3e765d597208f1e966268f9b" Apr 22 19:07:55.932472 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:55.932407 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50e08471cbb669f2748d44940cfd9ec02b00c32c3e765d597208f1e966268f9b"} err="failed to get container status \"50e08471cbb669f2748d44940cfd9ec02b00c32c3e765d597208f1e966268f9b\": rpc error: code = NotFound desc = could not find container \"50e08471cbb669f2748d44940cfd9ec02b00c32c3e765d597208f1e966268f9b\": container with ID starting with 
50e08471cbb669f2748d44940cfd9ec02b00c32c3e765d597208f1e966268f9b not found: ID does not exist" Apr 22 19:07:55.937700 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:55.937678 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f89f2d68-754a-41b4-85f9-fc71df7c37c5-kserve-provision-location\") pod \"f89f2d68-754a-41b4-85f9-fc71df7c37c5\" (UID: \"f89f2d68-754a-41b4-85f9-fc71df7c37c5\") " Apr 22 19:07:55.937788 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:55.937710 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f89f2d68-754a-41b4-85f9-fc71df7c37c5-model-cache\") pod \"f89f2d68-754a-41b4-85f9-fc71df7c37c5\" (UID: \"f89f2d68-754a-41b4-85f9-fc71df7c37c5\") " Apr 22 19:07:55.937788 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:55.937735 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f89f2d68-754a-41b4-85f9-fc71df7c37c5-dshm\") pod \"f89f2d68-754a-41b4-85f9-fc71df7c37c5\" (UID: \"f89f2d68-754a-41b4-85f9-fc71df7c37c5\") " Apr 22 19:07:55.937788 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:55.937761 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqss5\" (UniqueName: \"kubernetes.io/projected/f89f2d68-754a-41b4-85f9-fc71df7c37c5-kube-api-access-wqss5\") pod \"f89f2d68-754a-41b4-85f9-fc71df7c37c5\" (UID: \"f89f2d68-754a-41b4-85f9-fc71df7c37c5\") " Apr 22 19:07:55.937919 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:55.937806 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f89f2d68-754a-41b4-85f9-fc71df7c37c5-home\") pod \"f89f2d68-754a-41b4-85f9-fc71df7c37c5\" (UID: \"f89f2d68-754a-41b4-85f9-fc71df7c37c5\") " Apr 22 19:07:55.937919 ip-10-0-132-151 
kubenswrapper[2578]: I0422 19:07:55.937847 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f89f2d68-754a-41b4-85f9-fc71df7c37c5-tls-certs\") pod \"f89f2d68-754a-41b4-85f9-fc71df7c37c5\" (UID: \"f89f2d68-754a-41b4-85f9-fc71df7c37c5\") " Apr 22 19:07:55.938028 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:55.937989 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f89f2d68-754a-41b4-85f9-fc71df7c37c5-model-cache" (OuterVolumeSpecName: "model-cache") pod "f89f2d68-754a-41b4-85f9-fc71df7c37c5" (UID: "f89f2d68-754a-41b4-85f9-fc71df7c37c5"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:07:55.938123 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:55.938093 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f89f2d68-754a-41b4-85f9-fc71df7c37c5-home" (OuterVolumeSpecName: "home") pod "f89f2d68-754a-41b4-85f9-fc71df7c37c5" (UID: "f89f2d68-754a-41b4-85f9-fc71df7c37c5"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:07:55.938185 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:55.938125 2578 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f89f2d68-754a-41b4-85f9-fc71df7c37c5-model-cache\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:07:55.939939 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:55.939915 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f89f2d68-754a-41b4-85f9-fc71df7c37c5-dshm" (OuterVolumeSpecName: "dshm") pod "f89f2d68-754a-41b4-85f9-fc71df7c37c5" (UID: "f89f2d68-754a-41b4-85f9-fc71df7c37c5"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:07:55.940042 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:55.939947 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f89f2d68-754a-41b4-85f9-fc71df7c37c5-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f89f2d68-754a-41b4-85f9-fc71df7c37c5" (UID: "f89f2d68-754a-41b4-85f9-fc71df7c37c5"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:07:55.940042 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:55.939964 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f89f2d68-754a-41b4-85f9-fc71df7c37c5-kube-api-access-wqss5" (OuterVolumeSpecName: "kube-api-access-wqss5") pod "f89f2d68-754a-41b4-85f9-fc71df7c37c5" (UID: "f89f2d68-754a-41b4-85f9-fc71df7c37c5"). InnerVolumeSpecName "kube-api-access-wqss5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:07:55.999490 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:55.999428 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f89f2d68-754a-41b4-85f9-fc71df7c37c5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f89f2d68-754a-41b4-85f9-fc71df7c37c5" (UID: "f89f2d68-754a-41b4-85f9-fc71df7c37c5"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:07:56.038723 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:56.038684 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wqss5\" (UniqueName: \"kubernetes.io/projected/f89f2d68-754a-41b4-85f9-fc71df7c37c5-kube-api-access-wqss5\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:07:56.038723 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:56.038718 2578 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f89f2d68-754a-41b4-85f9-fc71df7c37c5-home\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:07:56.038723 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:56.038731 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f89f2d68-754a-41b4-85f9-fc71df7c37c5-tls-certs\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:07:56.038949 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:56.038741 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f89f2d68-754a-41b4-85f9-fc71df7c37c5-kserve-provision-location\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:07:56.038949 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:56.038749 2578 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f89f2d68-754a-41b4-85f9-fc71df7c37c5-dshm\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:07:56.857657 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:56.857628 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk" Apr 22 19:07:56.879301 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:56.879269 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk"] Apr 22 19:07:56.884037 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:56.884006 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-564cb98467-2fmrk"] Apr 22 19:07:57.539951 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:07:57.539914 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f89f2d68-754a-41b4-85f9-fc71df7c37c5" path="/var/lib/kubelet/pods/f89f2d68-754a-41b4-85f9-fc71df7c37c5/volumes" Apr 22 19:08:59.015253 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:08:59.015216 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8b8rg/must-gather-zqmxc"] Apr 22 19:08:59.015706 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:08:59.015582 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f89f2d68-754a-41b4-85f9-fc71df7c37c5" containerName="main" Apr 22 19:08:59.015706 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:08:59.015598 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f89f2d68-754a-41b4-85f9-fc71df7c37c5" containerName="main" Apr 22 19:08:59.015706 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:08:59.015608 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f89f2d68-754a-41b4-85f9-fc71df7c37c5" containerName="storage-initializer" Apr 22 19:08:59.015706 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:08:59.015615 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f89f2d68-754a-41b4-85f9-fc71df7c37c5" containerName="storage-initializer" Apr 22 19:08:59.015706 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:08:59.015668 2578 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="f89f2d68-754a-41b4-85f9-fc71df7c37c5" containerName="main" Apr 22 19:08:59.018944 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:08:59.018926 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8b8rg/must-gather-zqmxc" Apr 22 19:08:59.021695 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:08:59.021670 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-8b8rg\"/\"default-dockercfg-pmshx\"" Apr 22 19:08:59.022630 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:08:59.022609 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8b8rg\"/\"openshift-service-ca.crt\"" Apr 22 19:08:59.022630 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:08:59.022624 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8b8rg\"/\"kube-root-ca.crt\"" Apr 22 19:08:59.027734 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:08:59.027703 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8b8rg/must-gather-zqmxc"] Apr 22 19:08:59.137668 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:08:59.137625 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/94201f48-e182-41d5-be25-3bfeb20722bd-must-gather-output\") pod \"must-gather-zqmxc\" (UID: \"94201f48-e182-41d5-be25-3bfeb20722bd\") " pod="openshift-must-gather-8b8rg/must-gather-zqmxc" Apr 22 19:08:59.137870 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:08:59.137680 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84q7q\" (UniqueName: \"kubernetes.io/projected/94201f48-e182-41d5-be25-3bfeb20722bd-kube-api-access-84q7q\") pod \"must-gather-zqmxc\" (UID: \"94201f48-e182-41d5-be25-3bfeb20722bd\") " 
pod="openshift-must-gather-8b8rg/must-gather-zqmxc" Apr 22 19:08:59.238982 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:08:59.238934 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84q7q\" (UniqueName: \"kubernetes.io/projected/94201f48-e182-41d5-be25-3bfeb20722bd-kube-api-access-84q7q\") pod \"must-gather-zqmxc\" (UID: \"94201f48-e182-41d5-be25-3bfeb20722bd\") " pod="openshift-must-gather-8b8rg/must-gather-zqmxc" Apr 22 19:08:59.239172 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:08:59.239028 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/94201f48-e182-41d5-be25-3bfeb20722bd-must-gather-output\") pod \"must-gather-zqmxc\" (UID: \"94201f48-e182-41d5-be25-3bfeb20722bd\") " pod="openshift-must-gather-8b8rg/must-gather-zqmxc" Apr 22 19:08:59.239383 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:08:59.239362 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/94201f48-e182-41d5-be25-3bfeb20722bd-must-gather-output\") pod \"must-gather-zqmxc\" (UID: \"94201f48-e182-41d5-be25-3bfeb20722bd\") " pod="openshift-must-gather-8b8rg/must-gather-zqmxc" Apr 22 19:08:59.247635 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:08:59.247610 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-84q7q\" (UniqueName: \"kubernetes.io/projected/94201f48-e182-41d5-be25-3bfeb20722bd-kube-api-access-84q7q\") pod \"must-gather-zqmxc\" (UID: \"94201f48-e182-41d5-be25-3bfeb20722bd\") " pod="openshift-must-gather-8b8rg/must-gather-zqmxc" Apr 22 19:08:59.328828 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:08:59.328733 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8b8rg/must-gather-zqmxc" Apr 22 19:08:59.538944 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:08:59.538909 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8b8rg/must-gather-zqmxc"] Apr 22 19:08:59.545851 ip-10-0-132-151 kubenswrapper[2578]: W0422 19:08:59.545815 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94201f48_e182_41d5_be25_3bfeb20722bd.slice/crio-4d9864f48eea652e86d7c8bf2d1527ccfda1a66112bc19f845cca2e6fac10301 WatchSource:0}: Error finding container 4d9864f48eea652e86d7c8bf2d1527ccfda1a66112bc19f845cca2e6fac10301: Status 404 returned error can't find the container with id 4d9864f48eea652e86d7c8bf2d1527ccfda1a66112bc19f845cca2e6fac10301 Apr 22 19:09:00.070702 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:00.070659 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8b8rg/must-gather-zqmxc" event={"ID":"94201f48-e182-41d5-be25-3bfeb20722bd","Type":"ContainerStarted","Data":"4d9864f48eea652e86d7c8bf2d1527ccfda1a66112bc19f845cca2e6fac10301"} Apr 22 19:09:05.092867 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:05.092827 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8b8rg/must-gather-zqmxc" event={"ID":"94201f48-e182-41d5-be25-3bfeb20722bd","Type":"ContainerStarted","Data":"c5c0f250d89ceb1e5af80b0e640a730b7616cc7ddbc16424cd2a85ca56165763"} Apr 22 19:09:05.093270 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:05.092875 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8b8rg/must-gather-zqmxc" event={"ID":"94201f48-e182-41d5-be25-3bfeb20722bd","Type":"ContainerStarted","Data":"e3e9c5a7664b3c3b316128c8297690d766ed428bc83bce0d312b6dd41b0ed867"} Apr 22 19:09:05.111181 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:05.111126 2578 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-must-gather-8b8rg/must-gather-zqmxc" podStartSLOduration=1.65974558 podStartE2EDuration="6.111107949s" podCreationTimestamp="2026-04-22 19:08:59 +0000 UTC" firstStartedPulling="2026-04-22 19:08:59.547473408 +0000 UTC m=+1868.537740027" lastFinishedPulling="2026-04-22 19:09:03.998835772 +0000 UTC m=+1872.989102396" observedRunningTime="2026-04-22 19:09:05.109495285 +0000 UTC m=+1874.099761915" watchObservedRunningTime="2026-04-22 19:09:05.111107949 +0000 UTC m=+1874.101374578" Apr 22 19:09:13.693129 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:13.693044 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-fxmjr_a51f3ffd-8fcd-4925-8919-0586473712a1/istio-proxy/0.log" Apr 22 19:09:14.785162 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:14.785134 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-fxmjr_a51f3ffd-8fcd-4925-8919-0586473712a1/istio-proxy/0.log" Apr 22 19:09:15.871140 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:15.871107 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-fxmjr_a51f3ffd-8fcd-4925-8919-0586473712a1/istio-proxy/0.log" Apr 22 19:09:16.901216 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:16.901177 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-fxmjr_a51f3ffd-8fcd-4925-8919-0586473712a1/istio-proxy/0.log" Apr 22 19:09:17.932572 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:17.932527 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-fxmjr_a51f3ffd-8fcd-4925-8919-0586473712a1/istio-proxy/0.log" Apr 22 19:09:19.022154 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:19.022119 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-fxmjr_a51f3ffd-8fcd-4925-8919-0586473712a1/istio-proxy/0.log" Apr 22 19:09:20.066839 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:20.066799 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-fxmjr_a51f3ffd-8fcd-4925-8919-0586473712a1/istio-proxy/0.log" Apr 22 19:09:21.099173 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:21.099136 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-fxmjr_a51f3ffd-8fcd-4925-8919-0586473712a1/istio-proxy/0.log" Apr 22 19:09:22.138564 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:22.138527 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-fxmjr_a51f3ffd-8fcd-4925-8919-0586473712a1/istio-proxy/0.log" Apr 22 19:09:23.186884 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:23.186813 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-fxmjr_a51f3ffd-8fcd-4925-8919-0586473712a1/istio-proxy/0.log" Apr 22 19:09:24.366194 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:24.366161 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-fxmjr_a51f3ffd-8fcd-4925-8919-0586473712a1/istio-proxy/0.log" Apr 22 19:09:25.426397 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:25.426365 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-fxmjr_a51f3ffd-8fcd-4925-8919-0586473712a1/istio-proxy/0.log" Apr 22 19:09:26.452741 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:26.452709 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-fxmjr_a51f3ffd-8fcd-4925-8919-0586473712a1/istio-proxy/0.log" Apr 22 19:09:27.458688 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:27.458654 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-fxmjr_a51f3ffd-8fcd-4925-8919-0586473712a1/istio-proxy/0.log" Apr 22 19:09:30.205842 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:30.205805 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-sqm4w_425c87f5-c0cb-48be-9762-4f5d43de58f6/authorino/0.log" Apr 22 19:09:30.277042 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:30.277015 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-ddglq_6e94feca-ca43-412c-b1b4-23c6c48170b7/kuadrant-console-plugin/0.log" Apr 22 19:09:31.188557 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:31.188520 2578 generic.go:358] "Generic (PLEG): container finished" podID="94201f48-e182-41d5-be25-3bfeb20722bd" containerID="e3e9c5a7664b3c3b316128c8297690d766ed428bc83bce0d312b6dd41b0ed867" exitCode=0 Apr 22 19:09:31.188774 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:31.188597 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8b8rg/must-gather-zqmxc" event={"ID":"94201f48-e182-41d5-be25-3bfeb20722bd","Type":"ContainerDied","Data":"e3e9c5a7664b3c3b316128c8297690d766ed428bc83bce0d312b6dd41b0ed867"} Apr 22 19:09:31.188954 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:31.188940 2578 scope.go:117] "RemoveContainer" containerID="e3e9c5a7664b3c3b316128c8297690d766ed428bc83bce0d312b6dd41b0ed867" Apr 22 19:09:32.038083 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:32.038053 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8b8rg_must-gather-zqmxc_94201f48-e182-41d5-be25-3bfeb20722bd/gather/0.log" Apr 22 
19:09:32.707652 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:32.707614 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nmkll/must-gather-q7xl8"] Apr 22 19:09:32.712292 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:32.712273 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nmkll/must-gather-q7xl8" Apr 22 19:09:32.724555 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:32.724520 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nmkll\"/\"kube-root-ca.crt\"" Apr 22 19:09:32.724735 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:32.724707 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nmkll\"/\"openshift-service-ca.crt\"" Apr 22 19:09:32.725541 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:32.725517 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-nmkll\"/\"default-dockercfg-vxhkm\"" Apr 22 19:09:32.739925 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:32.739892 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nmkll/must-gather-q7xl8"] Apr 22 19:09:32.745698 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:32.745667 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9ad0588e-bede-4022-aa70-a1aa6e0d0035-must-gather-output\") pod \"must-gather-q7xl8\" (UID: \"9ad0588e-bede-4022-aa70-a1aa6e0d0035\") " pod="openshift-must-gather-nmkll/must-gather-q7xl8" Apr 22 19:09:32.745831 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:32.745734 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z29sj\" (UniqueName: \"kubernetes.io/projected/9ad0588e-bede-4022-aa70-a1aa6e0d0035-kube-api-access-z29sj\") pod 
\"must-gather-q7xl8\" (UID: \"9ad0588e-bede-4022-aa70-a1aa6e0d0035\") " pod="openshift-must-gather-nmkll/must-gather-q7xl8" Apr 22 19:09:32.846136 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:32.846093 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9ad0588e-bede-4022-aa70-a1aa6e0d0035-must-gather-output\") pod \"must-gather-q7xl8\" (UID: \"9ad0588e-bede-4022-aa70-a1aa6e0d0035\") " pod="openshift-must-gather-nmkll/must-gather-q7xl8" Apr 22 19:09:32.846342 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:32.846148 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z29sj\" (UniqueName: \"kubernetes.io/projected/9ad0588e-bede-4022-aa70-a1aa6e0d0035-kube-api-access-z29sj\") pod \"must-gather-q7xl8\" (UID: \"9ad0588e-bede-4022-aa70-a1aa6e0d0035\") " pod="openshift-must-gather-nmkll/must-gather-q7xl8" Apr 22 19:09:32.846553 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:32.846531 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9ad0588e-bede-4022-aa70-a1aa6e0d0035-must-gather-output\") pod \"must-gather-q7xl8\" (UID: \"9ad0588e-bede-4022-aa70-a1aa6e0d0035\") " pod="openshift-must-gather-nmkll/must-gather-q7xl8" Apr 22 19:09:32.855119 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:32.855096 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z29sj\" (UniqueName: \"kubernetes.io/projected/9ad0588e-bede-4022-aa70-a1aa6e0d0035-kube-api-access-z29sj\") pod \"must-gather-q7xl8\" (UID: \"9ad0588e-bede-4022-aa70-a1aa6e0d0035\") " pod="openshift-must-gather-nmkll/must-gather-q7xl8" Apr 22 19:09:33.026351 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:33.026255 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nmkll/must-gather-q7xl8" Apr 22 19:09:33.292972 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:33.292885 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nmkll/must-gather-q7xl8"] Apr 22 19:09:33.672412 ip-10-0-132-151 kubenswrapper[2578]: W0422 19:09:33.672375 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ad0588e_bede_4022_aa70_a1aa6e0d0035.slice/crio-a741cf3b7ff089687db2031aa2232b8a5289872351c85952a6320aae0534dfa3 WatchSource:0}: Error finding container a741cf3b7ff089687db2031aa2232b8a5289872351c85952a6320aae0534dfa3: Status 404 returned error can't find the container with id a741cf3b7ff089687db2031aa2232b8a5289872351c85952a6320aae0534dfa3 Apr 22 19:09:34.198588 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:34.198545 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nmkll/must-gather-q7xl8" event={"ID":"9ad0588e-bede-4022-aa70-a1aa6e0d0035","Type":"ContainerStarted","Data":"a741cf3b7ff089687db2031aa2232b8a5289872351c85952a6320aae0534dfa3"} Apr 22 19:09:35.205571 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:35.204999 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nmkll/must-gather-q7xl8" event={"ID":"9ad0588e-bede-4022-aa70-a1aa6e0d0035","Type":"ContainerStarted","Data":"5b9e87e07e0f96c7ae8fba3594533d151ed0344e456b71bb3bd4d4f7a0e9d234"} Apr 22 19:09:35.205571 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:35.205044 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nmkll/must-gather-q7xl8" event={"ID":"9ad0588e-bede-4022-aa70-a1aa6e0d0035","Type":"ContainerStarted","Data":"e9f9bfab4467939954c2c283e108f0606a6d1e5dc14cece6a43cd8acd704beb1"} Apr 22 19:09:35.223879 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:35.223822 2578 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-must-gather-nmkll/must-gather-q7xl8" podStartSLOduration=2.355888727 podStartE2EDuration="3.223804282s" podCreationTimestamp="2026-04-22 19:09:32 +0000 UTC" firstStartedPulling="2026-04-22 19:09:33.674078356 +0000 UTC m=+1902.664344967" lastFinishedPulling="2026-04-22 19:09:34.541993911 +0000 UTC m=+1903.532260522" observedRunningTime="2026-04-22 19:09:35.220905706 +0000 UTC m=+1904.211172336" watchObservedRunningTime="2026-04-22 19:09:35.223804282 +0000 UTC m=+1904.214070988" Apr 22 19:09:36.108272 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:36.108225 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-9hghn_cbd68ab5-95f9-489c-a958-928dc293c79a/global-pull-secret-syncer/0.log" Apr 22 19:09:36.205751 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:36.205720 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-gjdjf_940a66b8-963f-4a92-adee-fd47c60355d9/konnectivity-agent/0.log" Apr 22 19:09:36.312528 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:36.312490 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-151.ec2.internal_0786cfcb4c5f79e61aa898a7fed986dc/haproxy/0.log" Apr 22 19:09:37.524343 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:37.524296 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8b8rg/must-gather-zqmxc"] Apr 22 19:09:37.524891 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:37.524609 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-8b8rg/must-gather-zqmxc" podUID="94201f48-e182-41d5-be25-3bfeb20722bd" containerName="copy" containerID="cri-o://c5c0f250d89ceb1e5af80b0e640a730b7616cc7ddbc16424cd2a85ca56165763" gracePeriod=2 Apr 22 19:09:37.532901 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:37.532827 2578 status_manager.go:895] "Failed to get status for pod" 
podUID="94201f48-e182-41d5-be25-3bfeb20722bd" pod="openshift-must-gather-8b8rg/must-gather-zqmxc" err="pods \"must-gather-zqmxc\" is forbidden: User \"system:node:ip-10-0-132-151.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-8b8rg\": no relationship found between node 'ip-10-0-132-151.ec2.internal' and this object" Apr 22 19:09:37.545033 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:37.544643 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8b8rg/must-gather-zqmxc"] Apr 22 19:09:37.863799 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:37.863770 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8b8rg_must-gather-zqmxc_94201f48-e182-41d5-be25-3bfeb20722bd/copy/0.log" Apr 22 19:09:37.864226 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:37.864204 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8b8rg/must-gather-zqmxc" Apr 22 19:09:37.866479 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:37.866430 2578 status_manager.go:895] "Failed to get status for pod" podUID="94201f48-e182-41d5-be25-3bfeb20722bd" pod="openshift-must-gather-8b8rg/must-gather-zqmxc" err="pods \"must-gather-zqmxc\" is forbidden: User \"system:node:ip-10-0-132-151.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-8b8rg\": no relationship found between node 'ip-10-0-132-151.ec2.internal' and this object" Apr 22 19:09:37.889488 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:37.888792 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84q7q\" (UniqueName: \"kubernetes.io/projected/94201f48-e182-41d5-be25-3bfeb20722bd-kube-api-access-84q7q\") pod \"94201f48-e182-41d5-be25-3bfeb20722bd\" (UID: \"94201f48-e182-41d5-be25-3bfeb20722bd\") " Apr 22 19:09:37.889488 ip-10-0-132-151 kubenswrapper[2578]: I0422 
19:09:37.888864 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/94201f48-e182-41d5-be25-3bfeb20722bd-must-gather-output\") pod \"94201f48-e182-41d5-be25-3bfeb20722bd\" (UID: \"94201f48-e182-41d5-be25-3bfeb20722bd\") " Apr 22 19:09:37.903485 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:37.901999 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94201f48-e182-41d5-be25-3bfeb20722bd-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "94201f48-e182-41d5-be25-3bfeb20722bd" (UID: "94201f48-e182-41d5-be25-3bfeb20722bd"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:09:37.909025 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:37.908718 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94201f48-e182-41d5-be25-3bfeb20722bd-kube-api-access-84q7q" (OuterVolumeSpecName: "kube-api-access-84q7q") pod "94201f48-e182-41d5-be25-3bfeb20722bd" (UID: "94201f48-e182-41d5-be25-3bfeb20722bd"). InnerVolumeSpecName "kube-api-access-84q7q". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:09:37.990048 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:37.990001 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-84q7q\" (UniqueName: \"kubernetes.io/projected/94201f48-e182-41d5-be25-3bfeb20722bd-kube-api-access-84q7q\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:09:37.990048 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:37.990051 2578 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/94201f48-e182-41d5-be25-3bfeb20722bd-must-gather-output\") on node \"ip-10-0-132-151.ec2.internal\" DevicePath \"\"" Apr 22 19:09:38.219516 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:38.219481 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8b8rg_must-gather-zqmxc_94201f48-e182-41d5-be25-3bfeb20722bd/copy/0.log" Apr 22 19:09:38.219895 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:38.219864 2578 generic.go:358] "Generic (PLEG): container finished" podID="94201f48-e182-41d5-be25-3bfeb20722bd" containerID="c5c0f250d89ceb1e5af80b0e640a730b7616cc7ddbc16424cd2a85ca56165763" exitCode=143 Apr 22 19:09:38.220022 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:38.219929 2578 scope.go:117] "RemoveContainer" containerID="c5c0f250d89ceb1e5af80b0e640a730b7616cc7ddbc16424cd2a85ca56165763" Apr 22 19:09:38.220085 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:38.220068 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8b8rg/must-gather-zqmxc" Apr 22 19:09:38.234793 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:38.229734 2578 status_manager.go:895] "Failed to get status for pod" podUID="94201f48-e182-41d5-be25-3bfeb20722bd" pod="openshift-must-gather-8b8rg/must-gather-zqmxc" err="pods \"must-gather-zqmxc\" is forbidden: User \"system:node:ip-10-0-132-151.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-8b8rg\": no relationship found between node 'ip-10-0-132-151.ec2.internal' and this object" Apr 22 19:09:38.260527 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:38.258794 2578 scope.go:117] "RemoveContainer" containerID="e3e9c5a7664b3c3b316128c8297690d766ed428bc83bce0d312b6dd41b0ed867" Apr 22 19:09:38.263536 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:38.262697 2578 status_manager.go:895] "Failed to get status for pod" podUID="94201f48-e182-41d5-be25-3bfeb20722bd" pod="openshift-must-gather-8b8rg/must-gather-zqmxc" err="pods \"must-gather-zqmxc\" is forbidden: User \"system:node:ip-10-0-132-151.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-8b8rg\": no relationship found between node 'ip-10-0-132-151.ec2.internal' and this object" Apr 22 19:09:38.302668 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:38.298543 2578 scope.go:117] "RemoveContainer" containerID="c5c0f250d89ceb1e5af80b0e640a730b7616cc7ddbc16424cd2a85ca56165763" Apr 22 19:09:38.307485 ip-10-0-132-151 kubenswrapper[2578]: E0422 19:09:38.303613 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5c0f250d89ceb1e5af80b0e640a730b7616cc7ddbc16424cd2a85ca56165763\": container with ID starting with c5c0f250d89ceb1e5af80b0e640a730b7616cc7ddbc16424cd2a85ca56165763 not found: ID does not exist" containerID="c5c0f250d89ceb1e5af80b0e640a730b7616cc7ddbc16424cd2a85ca56165763" Apr 22 
19:09:38.307485 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:38.303666 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5c0f250d89ceb1e5af80b0e640a730b7616cc7ddbc16424cd2a85ca56165763"} err="failed to get container status \"c5c0f250d89ceb1e5af80b0e640a730b7616cc7ddbc16424cd2a85ca56165763\": rpc error: code = NotFound desc = could not find container \"c5c0f250d89ceb1e5af80b0e640a730b7616cc7ddbc16424cd2a85ca56165763\": container with ID starting with c5c0f250d89ceb1e5af80b0e640a730b7616cc7ddbc16424cd2a85ca56165763 not found: ID does not exist" Apr 22 19:09:38.307485 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:38.303696 2578 scope.go:117] "RemoveContainer" containerID="e3e9c5a7664b3c3b316128c8297690d766ed428bc83bce0d312b6dd41b0ed867" Apr 22 19:09:38.311284 ip-10-0-132-151 kubenswrapper[2578]: E0422 19:09:38.309594 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3e9c5a7664b3c3b316128c8297690d766ed428bc83bce0d312b6dd41b0ed867\": container with ID starting with e3e9c5a7664b3c3b316128c8297690d766ed428bc83bce0d312b6dd41b0ed867 not found: ID does not exist" containerID="e3e9c5a7664b3c3b316128c8297690d766ed428bc83bce0d312b6dd41b0ed867" Apr 22 19:09:38.311284 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:38.309640 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3e9c5a7664b3c3b316128c8297690d766ed428bc83bce0d312b6dd41b0ed867"} err="failed to get container status \"e3e9c5a7664b3c3b316128c8297690d766ed428bc83bce0d312b6dd41b0ed867\": rpc error: code = NotFound desc = could not find container \"e3e9c5a7664b3c3b316128c8297690d766ed428bc83bce0d312b6dd41b0ed867\": container with ID starting with e3e9c5a7664b3c3b316128c8297690d766ed428bc83bce0d312b6dd41b0ed867 not found: ID does not exist" Apr 22 19:09:39.542351 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:39.542311 2578 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94201f48-e182-41d5-be25-3bfeb20722bd" path="/var/lib/kubelet/pods/94201f48-e182-41d5-be25-3bfeb20722bd/volumes" Apr 22 19:09:40.396884 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:40.396842 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-sqm4w_425c87f5-c0cb-48be-9762-4f5d43de58f6/authorino/0.log" Apr 22 19:09:40.481820 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:40.481752 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-ddglq_6e94feca-ca43-412c-b1b4-23c6c48170b7/kuadrant-console-plugin/0.log" Apr 22 19:09:41.751602 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:41.751567 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-pc6pk_2812ef50-c4e4-42a2-940f-01dd5bf968d4/cluster-monitoring-operator/0.log" Apr 22 19:09:42.083296 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:42.083198 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-w87gw_94e10a47-4857-4cff-9fbb-fbf8f76c8ae6/node-exporter/0.log" Apr 22 19:09:42.106682 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:42.106653 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-w87gw_94e10a47-4857-4cff-9fbb-fbf8f76c8ae6/kube-rbac-proxy/0.log" Apr 22 19:09:42.130951 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:42.130925 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-w87gw_94e10a47-4857-4cff-9fbb-fbf8f76c8ae6/init-textfile/0.log" Apr 22 19:09:42.646486 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:42.646432 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6f8f9b78fb-vdqt8_197c7922-b84c-45ba-a3ee-a9b7ca6e75d5/thanos-query/0.log" Apr 22 19:09:42.678411 
ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:42.678374 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6f8f9b78fb-vdqt8_197c7922-b84c-45ba-a3ee-a9b7ca6e75d5/kube-rbac-proxy-web/0.log" Apr 22 19:09:42.701083 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:42.701048 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6f8f9b78fb-vdqt8_197c7922-b84c-45ba-a3ee-a9b7ca6e75d5/kube-rbac-proxy/0.log" Apr 22 19:09:42.729826 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:42.729789 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6f8f9b78fb-vdqt8_197c7922-b84c-45ba-a3ee-a9b7ca6e75d5/prom-label-proxy/0.log" Apr 22 19:09:42.750079 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:42.750030 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6f8f9b78fb-vdqt8_197c7922-b84c-45ba-a3ee-a9b7ca6e75d5/kube-rbac-proxy-rules/0.log" Apr 22 19:09:42.774915 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:42.774878 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6f8f9b78fb-vdqt8_197c7922-b84c-45ba-a3ee-a9b7ca6e75d5/kube-rbac-proxy-metrics/0.log" Apr 22 19:09:45.367674 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:45.367631 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nmkll/perf-node-gather-daemonset-mspmj"] Apr 22 19:09:45.368274 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:45.368100 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94201f48-e182-41d5-be25-3bfeb20722bd" containerName="copy" Apr 22 19:09:45.368274 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:45.368120 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="94201f48-e182-41d5-be25-3bfeb20722bd" containerName="copy" Apr 22 19:09:45.368274 ip-10-0-132-151 kubenswrapper[2578]: I0422 
19:09:45.368149 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94201f48-e182-41d5-be25-3bfeb20722bd" containerName="gather" Apr 22 19:09:45.368274 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:45.368158 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="94201f48-e182-41d5-be25-3bfeb20722bd" containerName="gather" Apr 22 19:09:45.368274 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:45.368238 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="94201f48-e182-41d5-be25-3bfeb20722bd" containerName="copy" Apr 22 19:09:45.368274 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:45.368251 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="94201f48-e182-41d5-be25-3bfeb20722bd" containerName="gather" Apr 22 19:09:45.371886 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:45.371858 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-mspmj" Apr 22 19:09:45.381440 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:45.381407 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nmkll/perf-node-gather-daemonset-mspmj"] Apr 22 19:09:45.470342 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:45.470296 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/79844f03-712d-4a41-ad2e-2a7ef87d6c4c-proc\") pod \"perf-node-gather-daemonset-mspmj\" (UID: \"79844f03-712d-4a41-ad2e-2a7ef87d6c4c\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-mspmj" Apr 22 19:09:45.470579 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:45.470397 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/79844f03-712d-4a41-ad2e-2a7ef87d6c4c-sys\") pod \"perf-node-gather-daemonset-mspmj\" (UID: 
\"79844f03-712d-4a41-ad2e-2a7ef87d6c4c\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-mspmj" Apr 22 19:09:45.470579 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:45.470491 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj22m\" (UniqueName: \"kubernetes.io/projected/79844f03-712d-4a41-ad2e-2a7ef87d6c4c-kube-api-access-mj22m\") pod \"perf-node-gather-daemonset-mspmj\" (UID: \"79844f03-712d-4a41-ad2e-2a7ef87d6c4c\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-mspmj" Apr 22 19:09:45.470579 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:45.470566 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/79844f03-712d-4a41-ad2e-2a7ef87d6c4c-lib-modules\") pod \"perf-node-gather-daemonset-mspmj\" (UID: \"79844f03-712d-4a41-ad2e-2a7ef87d6c4c\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-mspmj" Apr 22 19:09:45.470705 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:45.470628 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/79844f03-712d-4a41-ad2e-2a7ef87d6c4c-podres\") pod \"perf-node-gather-daemonset-mspmj\" (UID: \"79844f03-712d-4a41-ad2e-2a7ef87d6c4c\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-mspmj" Apr 22 19:09:45.571288 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:45.571252 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mj22m\" (UniqueName: \"kubernetes.io/projected/79844f03-712d-4a41-ad2e-2a7ef87d6c4c-kube-api-access-mj22m\") pod \"perf-node-gather-daemonset-mspmj\" (UID: \"79844f03-712d-4a41-ad2e-2a7ef87d6c4c\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-mspmj" Apr 22 19:09:45.571499 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:45.571301 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/79844f03-712d-4a41-ad2e-2a7ef87d6c4c-lib-modules\") pod \"perf-node-gather-daemonset-mspmj\" (UID: \"79844f03-712d-4a41-ad2e-2a7ef87d6c4c\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-mspmj" Apr 22 19:09:45.571499 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:45.571331 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/79844f03-712d-4a41-ad2e-2a7ef87d6c4c-podres\") pod \"perf-node-gather-daemonset-mspmj\" (UID: \"79844f03-712d-4a41-ad2e-2a7ef87d6c4c\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-mspmj" Apr 22 19:09:45.571499 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:45.571434 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/79844f03-712d-4a41-ad2e-2a7ef87d6c4c-podres\") pod \"perf-node-gather-daemonset-mspmj\" (UID: \"79844f03-712d-4a41-ad2e-2a7ef87d6c4c\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-mspmj" Apr 22 19:09:45.571663 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:45.571498 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/79844f03-712d-4a41-ad2e-2a7ef87d6c4c-lib-modules\") pod \"perf-node-gather-daemonset-mspmj\" (UID: \"79844f03-712d-4a41-ad2e-2a7ef87d6c4c\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-mspmj" Apr 22 19:09:45.571663 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:45.571497 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/79844f03-712d-4a41-ad2e-2a7ef87d6c4c-proc\") pod \"perf-node-gather-daemonset-mspmj\" (UID: \"79844f03-712d-4a41-ad2e-2a7ef87d6c4c\") " 
pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-mspmj" Apr 22 19:09:45.571663 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:45.571556 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/79844f03-712d-4a41-ad2e-2a7ef87d6c4c-proc\") pod \"perf-node-gather-daemonset-mspmj\" (UID: \"79844f03-712d-4a41-ad2e-2a7ef87d6c4c\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-mspmj" Apr 22 19:09:45.571663 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:45.571573 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/79844f03-712d-4a41-ad2e-2a7ef87d6c4c-sys\") pod \"perf-node-gather-daemonset-mspmj\" (UID: \"79844f03-712d-4a41-ad2e-2a7ef87d6c4c\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-mspmj" Apr 22 19:09:45.571795 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:45.571661 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/79844f03-712d-4a41-ad2e-2a7ef87d6c4c-sys\") pod \"perf-node-gather-daemonset-mspmj\" (UID: \"79844f03-712d-4a41-ad2e-2a7ef87d6c4c\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-mspmj" Apr 22 19:09:45.580740 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:45.580673 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj22m\" (UniqueName: \"kubernetes.io/projected/79844f03-712d-4a41-ad2e-2a7ef87d6c4c-kube-api-access-mj22m\") pod \"perf-node-gather-daemonset-mspmj\" (UID: \"79844f03-712d-4a41-ad2e-2a7ef87d6c4c\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-mspmj" Apr 22 19:09:45.657168 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:45.657139 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-qtbjf_76c4e788-057e-424f-9bb3-f21537fa2489/volume-data-source-validator/0.log"
Apr 22 19:09:45.685372 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:45.685329 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-mspmj"
Apr 22 19:09:46.259994 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:46.259962 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nmkll/perf-node-gather-daemonset-mspmj"]
Apr 22 19:09:46.291147 ip-10-0-132-151 kubenswrapper[2578]: W0422 19:09:46.291101 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod79844f03_712d_4a41_ad2e_2a7ef87d6c4c.slice/crio-9083e6f7ba55588cc90341f508d34f4d6d0ee5b70f847d70070dc3fb5627da26 WatchSource:0}: Error finding container 9083e6f7ba55588cc90341f508d34f4d6d0ee5b70f847d70070dc3fb5627da26: Status 404 returned error can't find the container with id 9083e6f7ba55588cc90341f508d34f4d6d0ee5b70f847d70070dc3fb5627da26
Apr 22 19:09:46.519208 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:46.519128 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-sv8gg_eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4/dns/0.log"
Apr 22 19:09:46.541007 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:46.540976 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-sv8gg_eb4a471c-0b73-4b4f-9dcf-8c9a29e743e4/kube-rbac-proxy/0.log"
Apr 22 19:09:46.614527 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:46.614495 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-htng9_b0d1706a-8c90-4a85-abfe-07cf53827364/dns-node-resolver/0.log"
Apr 22 19:09:47.118352 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:47.118314 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-8vfmp_5cac15d4-0328-41c3-8bbc-e6d020fb09d2/node-ca/0.log"
Apr 22 19:09:47.264375 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:47.264332 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-mspmj" event={"ID":"79844f03-712d-4a41-ad2e-2a7ef87d6c4c","Type":"ContainerStarted","Data":"039eaf0b3a8c9639f29dcd00e9a1ccb7362f4107ff6d34037d6fecd2d1955023"}
Apr 22 19:09:47.264375 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:47.264370 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-mspmj" event={"ID":"79844f03-712d-4a41-ad2e-2a7ef87d6c4c","Type":"ContainerStarted","Data":"9083e6f7ba55588cc90341f508d34f4d6d0ee5b70f847d70070dc3fb5627da26"}
Apr 22 19:09:47.264618 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:47.264491 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-mspmj"
Apr 22 19:09:47.281235 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:47.281179 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-mspmj" podStartSLOduration=2.2811624090000002 podStartE2EDuration="2.281162409s" podCreationTimestamp="2026-04-22 19:09:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:09:47.2788955 +0000 UTC m=+1916.269162133" watchObservedRunningTime="2026-04-22 19:09:47.281162409 +0000 UTC m=+1916.271429038"
Apr 22 19:09:48.564753 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:48.564722 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-xpp6f_c91e57b3-5752-444f-9b93-89af6a4673a4/serve-healthcheck-canary/0.log"
Apr 22 19:09:49.184622 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:49.184590 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-v8tzj_d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae/kube-rbac-proxy/0.log"
Apr 22 19:09:49.206122 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:49.206084 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-v8tzj_d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae/exporter/0.log"
Apr 22 19:09:49.228403 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:49.228366 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-v8tzj_d51fb0f2-c7fa-4df1-bc32-d4d93320f4ae/extractor/0.log"
Apr 22 19:09:51.813601 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:51.813567 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-59b94d4c58-q8sqm_f2c5a268-5d72-4c28-a3bc-043441856d64/manager/0.log"
Apr 22 19:09:52.445345 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:52.445320 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-hj6k6_1a7017da-d849-49c5-a9f6-fe9997ffa457/server/0.log"
Apr 22 19:09:52.819970 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:52.819883 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-t972h_f57a8eb1-cc41-45bb-8108-8ddf1a2d5229/s3-init/0.log"
Apr 22 19:09:53.279146 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:53.279116 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-mspmj"
Apr 22 19:09:57.635980 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:57.635849 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-qrql8_e9303b48-0205-46d7-b7c1-bef8c52ae1c2/migrator/0.log"
Apr 22 19:09:57.659079 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:57.659041 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-qrql8_e9303b48-0205-46d7-b7c1-bef8c52ae1c2/graceful-termination/0.log"
Apr 22 19:09:59.063924 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:59.063893 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-772rv_f748f72b-69ca-4686-b516-08b6a1e0e7a1/kube-multus-additional-cni-plugins/0.log"
Apr 22 19:09:59.093595 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:59.093566 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-772rv_f748f72b-69ca-4686-b516-08b6a1e0e7a1/egress-router-binary-copy/0.log"
Apr 22 19:09:59.125848 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:59.125814 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-772rv_f748f72b-69ca-4686-b516-08b6a1e0e7a1/cni-plugins/0.log"
Apr 22 19:09:59.148963 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:59.148936 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-772rv_f748f72b-69ca-4686-b516-08b6a1e0e7a1/bond-cni-plugin/0.log"
Apr 22 19:09:59.171874 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:59.171848 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-772rv_f748f72b-69ca-4686-b516-08b6a1e0e7a1/routeoverride-cni/0.log"
Apr 22 19:09:59.205716 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:59.205682 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-772rv_f748f72b-69ca-4686-b516-08b6a1e0e7a1/whereabouts-cni-bincopy/0.log"
Apr 22 19:09:59.233558 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:59.233519 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-772rv_f748f72b-69ca-4686-b516-08b6a1e0e7a1/whereabouts-cni/0.log"
Apr 22 19:09:59.621298 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:59.621259 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kh2qh_96f11a65-b703-4a16-911b-e932a7067c05/kube-multus/0.log"
Apr 22 19:09:59.726776 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:59.726740 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-bkhdr_9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f/network-metrics-daemon/0.log"
Apr 22 19:09:59.747737 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:09:59.747708 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-bkhdr_9e2c690d-c0a7-4b29-bb43-9d60bcb7da7f/kube-rbac-proxy/0.log"
Apr 22 19:10:00.945714 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:10:00.945680 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2rx9_14a4227e-0dba-42ae-a250-c23eb012b836/ovn-controller/0.log"
Apr 22 19:10:00.964077 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:10:00.964040 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2rx9_14a4227e-0dba-42ae-a250-c23eb012b836/ovn-acl-logging/0.log"
Apr 22 19:10:00.973758 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:10:00.973732 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2rx9_14a4227e-0dba-42ae-a250-c23eb012b836/ovn-acl-logging/1.log"
Apr 22 19:10:00.996131 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:10:00.996101 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2rx9_14a4227e-0dba-42ae-a250-c23eb012b836/kube-rbac-proxy-node/0.log"
Apr 22 19:10:01.018895 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:10:01.018852 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2rx9_14a4227e-0dba-42ae-a250-c23eb012b836/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 19:10:01.042928 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:10:01.042902 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2rx9_14a4227e-0dba-42ae-a250-c23eb012b836/northd/0.log"
Apr 22 19:10:01.064714 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:10:01.064683 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2rx9_14a4227e-0dba-42ae-a250-c23eb012b836/nbdb/0.log"
Apr 22 19:10:01.086064 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:10:01.086032 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2rx9_14a4227e-0dba-42ae-a250-c23eb012b836/sbdb/0.log"
Apr 22 19:10:01.201448 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:10:01.201366 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2rx9_14a4227e-0dba-42ae-a250-c23eb012b836/ovnkube-controller/0.log"
Apr 22 19:10:02.593847 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:10:02.593819 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-bw86q_6855095a-2036-4c98-8d4f-2bea5b4d8cd6/check-endpoints/0.log"
Apr 22 19:10:02.657192 ip-10-0-132-151 kubenswrapper[2578]: I0422 19:10:02.657153 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-fqfx5_33339752-7b6a-4821-a16d-b079144b080d/network-check-target-container/0.log"