Apr 16 20:11:39.348229 ip-10-0-141-145 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 20:11:39.803666 ip-10-0-141-145 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:11:39.803666 ip-10-0-141-145 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 20:11:39.803666 ip-10-0-141-145 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:11:39.803666 ip-10-0-141-145 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 20:11:39.803666 ip-10-0-141-145 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:11:39.804439 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.804370 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 20:11:39.806655 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806640 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:11:39.806655 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806655 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:11:39.806718 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806659 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:11:39.806718 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806663 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:11:39.806718 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806666 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:11:39.806718 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806669 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:11:39.806718 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806672 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:11:39.806718 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806674 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:11:39.806718 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806677 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:11:39.806718 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806680 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:11:39.806718 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806682 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:11:39.806718 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806685 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:11:39.806718 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806687 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:11:39.806718 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806690 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:11:39.806718 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806693 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:11:39.806718 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806696 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:11:39.806718 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806698 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:11:39.806718 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806701 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:11:39.806718 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806703 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:11:39.806718 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806706 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:11:39.806718 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806708 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:11:39.806718 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806711 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:11:39.807193 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806713 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:11:39.807193 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806718 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:11:39.807193 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806721 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:11:39.807193 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806725 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:11:39.807193 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806728 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:11:39.807193 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806731 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:11:39.807193 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806734 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:11:39.807193 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806737 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:11:39.807193 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806740 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:11:39.807193 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806743 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:11:39.807193 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806746 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:11:39.807193 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806749 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:11:39.807193 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806752 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:11:39.807193 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806754 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:11:39.807193 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806757 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:11:39.807193 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806759 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:11:39.807193 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806762 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:11:39.807193 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806767 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:11:39.807193 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806771 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:11:39.807673 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806773 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:11:39.807673 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806776 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:11:39.807673 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806778 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:11:39.807673 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806781 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:11:39.807673 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806783 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:11:39.807673 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806786 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:11:39.807673 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806788 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:11:39.807673 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806791 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:11:39.807673 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806800 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:11:39.807673 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806803 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:11:39.807673 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806805 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:11:39.807673 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806807 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:11:39.807673 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806810 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:11:39.807673 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806812 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:11:39.807673 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806816 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:11:39.807673 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806819 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:11:39.807673 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806821 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:11:39.807673 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806824 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:11:39.807673 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806826 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:11:39.808140 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806829 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:11:39.808140 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806832 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:11:39.808140 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806834 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:11:39.808140 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806837 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:11:39.808140 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806839 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:11:39.808140 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806842 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:11:39.808140 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806844 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:11:39.808140 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806846 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:11:39.808140 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806849 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:11:39.808140 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806852 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:11:39.808140 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806854 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:11:39.808140 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806857 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:11:39.808140 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806859 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:11:39.808140 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806862 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:11:39.808140 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806864 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:11:39.808140 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806867 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:11:39.808140 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806869 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:11:39.808140 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806872 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:11:39.808140 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806874 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:11:39.808140 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806877 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:11:39.808600 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806880 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:11:39.808600 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806883 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:11:39.808600 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806885 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:11:39.808600 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806887 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:11:39.808600 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806890 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:11:39.808600 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.806893 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:11:39.808600 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807274 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:11:39.808600 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807280 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:11:39.808600 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807283 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:11:39.808600 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807285 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:11:39.808600 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807288 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:11:39.808600 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807290 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:11:39.808600 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807293 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:11:39.808600 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807296 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:11:39.808600 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807298 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:11:39.808600 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807301 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:11:39.808600 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807303 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:11:39.808600 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807306 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:11:39.808600 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807309 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:11:39.808600 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807312 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:11:39.809124 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807314 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:11:39.809124 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807318 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:11:39.809124 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807322 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:11:39.809124 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807325 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:11:39.809124 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807327 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:11:39.809124 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807330 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:11:39.809124 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807333 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:11:39.809124 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807335 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:11:39.809124 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807338 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:11:39.809124 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807340 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:11:39.809124 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807342 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:11:39.809124 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807345 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:11:39.809124 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807348 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:11:39.809124 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807350 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:11:39.809124 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807353 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:11:39.809124 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807355 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:11:39.809124 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807357 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:11:39.809124 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807360 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:11:39.809124 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807362 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:11:39.809124 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807365 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:11:39.809634 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807367 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:11:39.809634 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807370 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:11:39.809634 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807373 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:11:39.809634 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807375 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:11:39.809634 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807378 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:11:39.809634 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807380 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:11:39.809634 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807382 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:11:39.809634 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807385 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:11:39.809634 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807388 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:11:39.809634 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807390 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:11:39.809634 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807392 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:11:39.809634 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807395 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:11:39.809634 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807398 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:11:39.809634 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807400 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:11:39.809634 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807403 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:11:39.809634 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807405 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:11:39.809634 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807416 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:11:39.809634 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807418 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:11:39.809634 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807421 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:11:39.810108 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807424 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:11:39.810108 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807427 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:11:39.810108 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807429 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:11:39.810108 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807432 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:11:39.810108 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807434 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:11:39.810108 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807438 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:11:39.810108 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807440 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:11:39.810108 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807443 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:11:39.810108 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807445 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:11:39.810108 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807448 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:11:39.810108 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807450 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:11:39.810108 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807453 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:11:39.810108 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807456 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:11:39.810108 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807458 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:11:39.810108 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807461 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:11:39.810108 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807463 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:11:39.810108 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807465 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:11:39.810108 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807468 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:11:39.810108 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807470 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:11:39.810108 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807473 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:11:39.810579 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807476 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:11:39.810579 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807479 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:11:39.810579 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807481 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:11:39.810579 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807484 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:11:39.810579 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807486 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:11:39.810579 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807489 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:11:39.810579 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807491 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:11:39.810579 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807497 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:11:39.810579 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807500 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:11:39.810579 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807503 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:11:39.810579 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807506 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:11:39.810579 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807508 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:11:39.810579 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.807511 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:11:39.810579 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808625 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 20:11:39.810579 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808638 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 20:11:39.810579 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808643 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 20:11:39.810579 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808648 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 20:11:39.810579 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808652 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 20:11:39.810579 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808655 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 20:11:39.810579 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808659 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 20:11:39.811147 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808663 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 20:11:39.811147 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808666 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 20:11:39.811147 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808669 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 20:11:39.811147 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808672 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 20:11:39.811147 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808675 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 20:11:39.811147 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808679 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 20:11:39.811147 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808683 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 16 20:11:39.811147 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808686 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 20:11:39.811147 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808688 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 16 20:11:39.811147 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808691 2576 flags.go:64] FLAG: --cloud-config=""
Apr 16 20:11:39.811147 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808694 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 20:11:39.811147 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808697 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 20:11:39.811147 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808701 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 16 20:11:39.811147 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808704 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 20:11:39.811147 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808707 2576 flags.go:64] FLAG: --config-dir=""
Apr 16 20:11:39.811147 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808709 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 20:11:39.811147 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808713 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 20:11:39.811147 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808717 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 20:11:39.811147 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808720 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 20:11:39.811147 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808723 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 20:11:39.811147 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808726 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 20:11:39.811147 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808729 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 20:11:39.811147 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808732 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 20:11:39.811147 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808735 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 20:11:39.811705 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808737 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 20:11:39.811705 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808741 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 20:11:39.811705 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808745 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 20:11:39.811705 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808748 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 20:11:39.811705 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808751 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 20:11:39.811705 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808754 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 20:11:39.811705 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808756 2576 flags.go:64] FLAG: --enable-server="true"
Apr 16 20:11:39.811705 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808759 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 20:11:39.811705 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808763 2576 flags.go:64] FLAG: --event-burst="100"
Apr 16 20:11:39.811705 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808766 2576 flags.go:64] FLAG: --event-qps="50"
Apr 16 20:11:39.811705 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808768 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 20:11:39.811705 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808771 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 20:11:39.811705 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808774 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 16 20:11:39.811705 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808777 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 20:11:39.811705 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808780 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 20:11:39.811705 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808783 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 20:11:39.811705 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808786 2576 flags.go:64] FLAG: --eviction-soft=""
Apr 16 20:11:39.811705 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808789 2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 20:11:39.811705 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808791 2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 20:11:39.811705 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808794 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 20:11:39.811705 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808797 2576 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 20:11:39.811705 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808800 2576 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 20:11:39.811705 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808803 2576 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 20:11:39.811705 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808806 2576 flags.go:64] FLAG: --feature-gates=""
Apr 16 20:11:39.811705 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808809 2576 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 20:11:39.812299 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808812 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 20:11:39.812299 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808815 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 20:11:39.812299 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808818 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 20:11:39.812299 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808821 2576 flags.go:64] FLAG:
--healthz-port="10248" Apr 16 20:11:39.812299 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808823 2576 flags.go:64] FLAG: --help="false" Apr 16 20:11:39.812299 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808826 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-141-145.ec2.internal" Apr 16 20:11:39.812299 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808829 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 20:11:39.812299 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808832 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 20:11:39.812299 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808835 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 20:11:39.812299 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808839 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 20:11:39.812299 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808842 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 20:11:39.812299 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808845 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 20:11:39.812299 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808847 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 20:11:39.812299 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808850 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 20:11:39.812299 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808853 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 20:11:39.812299 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808856 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 20:11:39.812299 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808859 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 20:11:39.812299 ip-10-0-141-145 
kubenswrapper[2576]: I0416 20:11:39.808861 2576 flags.go:64] FLAG: --kube-reserved="" Apr 16 20:11:39.812299 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808864 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 20:11:39.812299 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808867 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 20:11:39.812299 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808870 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 20:11:39.812299 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808872 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 20:11:39.812299 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808875 2576 flags.go:64] FLAG: --lock-file="" Apr 16 20:11:39.812299 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808878 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 20:11:39.812855 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808880 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 20:11:39.812855 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808883 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 20:11:39.812855 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808888 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 20:11:39.812855 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808891 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 20:11:39.812855 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808894 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 20:11:39.812855 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808897 2576 flags.go:64] FLAG: --logging-format="text" Apr 16 20:11:39.812855 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808900 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 20:11:39.812855 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808903 2576 flags.go:64] FLAG: 
--make-iptables-util-chains="true" Apr 16 20:11:39.812855 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808906 2576 flags.go:64] FLAG: --manifest-url="" Apr 16 20:11:39.812855 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808909 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 16 20:11:39.812855 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808913 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 20:11:39.812855 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808915 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 20:11:39.812855 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808919 2576 flags.go:64] FLAG: --max-pods="110" Apr 16 20:11:39.812855 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808922 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 20:11:39.812855 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808925 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 20:11:39.812855 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808928 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 20:11:39.812855 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808931 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 20:11:39.812855 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808934 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 20:11:39.812855 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808937 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 20:11:39.812855 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808940 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 20:11:39.812855 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808947 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 20:11:39.812855 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808950 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 
20:11:39.812855 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808953 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 20:11:39.812855 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808956 2576 flags.go:64] FLAG: --pod-cidr="" Apr 16 20:11:39.813434 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808958 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 20:11:39.813434 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808964 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 20:11:39.813434 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808967 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 20:11:39.813434 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808969 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 16 20:11:39.813434 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808972 2576 flags.go:64] FLAG: --port="10250" Apr 16 20:11:39.813434 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808988 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 20:11:39.813434 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808991 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-099497609554c627c" Apr 16 20:11:39.813434 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808994 2576 flags.go:64] FLAG: --qos-reserved="" Apr 16 20:11:39.813434 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.808997 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 16 20:11:39.813434 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809000 2576 flags.go:64] FLAG: --register-node="true" Apr 16 20:11:39.813434 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809003 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 16 20:11:39.813434 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809006 2576 flags.go:64] FLAG: --register-with-taints="" Apr 16 20:11:39.813434 ip-10-0-141-145 
kubenswrapper[2576]: I0416 20:11:39.809009 2576 flags.go:64] FLAG: --registry-burst="10" Apr 16 20:11:39.813434 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809012 2576 flags.go:64] FLAG: --registry-qps="5" Apr 16 20:11:39.813434 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809015 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 16 20:11:39.813434 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809025 2576 flags.go:64] FLAG: --reserved-memory="" Apr 16 20:11:39.813434 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809032 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 20:11:39.813434 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809035 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 20:11:39.813434 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809038 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 20:11:39.813434 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809041 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 20:11:39.813434 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809044 2576 flags.go:64] FLAG: --runonce="false" Apr 16 20:11:39.813434 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809047 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 20:11:39.813434 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809050 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 20:11:39.813434 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809053 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 16 20:11:39.813434 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809055 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 20:11:39.814030 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809058 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 20:11:39.814030 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809061 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 
20:11:39.814030 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809065 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 20:11:39.814030 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809068 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 20:11:39.814030 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809071 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 20:11:39.814030 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809073 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 20:11:39.814030 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809076 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 20:11:39.814030 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809079 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 20:11:39.814030 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809082 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 20:11:39.814030 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809085 2576 flags.go:64] FLAG: --system-cgroups="" Apr 16 20:11:39.814030 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809088 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 20:11:39.814030 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809093 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 20:11:39.814030 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809095 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 16 20:11:39.814030 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809098 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 20:11:39.814030 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809105 2576 flags.go:64] FLAG: --tls-min-version="" Apr 16 20:11:39.814030 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809108 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 20:11:39.814030 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809111 2576 
flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 20:11:39.814030 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809113 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 20:11:39.814030 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809116 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 20:11:39.814030 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809119 2576 flags.go:64] FLAG: --v="2" Apr 16 20:11:39.814030 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809124 2576 flags.go:64] FLAG: --version="false" Apr 16 20:11:39.814030 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809128 2576 flags.go:64] FLAG: --vmodule="" Apr 16 20:11:39.814030 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809132 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 20:11:39.814030 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.809136 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 20:11:39.814030 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809239 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 20:11:39.814628 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809243 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 20:11:39.814628 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809246 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 20:11:39.814628 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809249 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 20:11:39.814628 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809252 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 20:11:39.814628 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809255 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 20:11:39.814628 ip-10-0-141-145 
kubenswrapper[2576]: W0416 20:11:39.809258 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 20:11:39.814628 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809260 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 20:11:39.814628 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809266 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 16 20:11:39.814628 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809269 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 20:11:39.814628 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809272 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 20:11:39.814628 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809275 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 20:11:39.814628 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809278 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 20:11:39.814628 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809280 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 20:11:39.814628 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809283 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 20:11:39.814628 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809285 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 20:11:39.814628 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809288 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 20:11:39.814628 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809290 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 20:11:39.814628 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809293 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 20:11:39.814628 
ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809295 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 20:11:39.814628 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809298 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 20:11:39.814628 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809301 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 20:11:39.815529 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809303 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 20:11:39.815529 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809306 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 20:11:39.815529 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809308 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 20:11:39.815529 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809311 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 20:11:39.815529 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809313 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 20:11:39.815529 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809316 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 20:11:39.815529 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809318 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 20:11:39.815529 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809321 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 20:11:39.815529 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809324 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 20:11:39.815529 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809326 2576 
feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 20:11:39.815529 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809329 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 20:11:39.815529 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809331 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 20:11:39.815529 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809334 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 20:11:39.815529 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809336 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 20:11:39.815529 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809339 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 20:11:39.815529 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809341 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 20:11:39.815529 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809344 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 20:11:39.815529 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809346 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 20:11:39.815529 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809350 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 20:11:39.816289 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809352 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 20:11:39.816289 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809355 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 20:11:39.816289 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809357 2576 feature_gate.go:328] unrecognized feature gate: 
ImageStreamImportMode Apr 16 20:11:39.816289 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809360 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 20:11:39.816289 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809364 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 20:11:39.816289 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809368 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 20:11:39.816289 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809373 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 20:11:39.816289 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809376 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 20:11:39.816289 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809378 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 20:11:39.816289 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809381 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 20:11:39.816289 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809384 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 20:11:39.816289 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809387 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 20:11:39.816289 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809390 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 20:11:39.816289 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809392 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 20:11:39.816289 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809394 2576 feature_gate.go:328] unrecognized feature 
gate: MachineAPIMigration Apr 16 20:11:39.816289 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809397 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 20:11:39.816289 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809399 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 20:11:39.816289 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809402 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 20:11:39.816289 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809406 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 20:11:39.816893 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809409 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 20:11:39.816893 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809412 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 20:11:39.816893 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809414 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 20:11:39.816893 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809417 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 20:11:39.816893 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809419 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 20:11:39.816893 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809422 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 20:11:39.816893 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809424 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 20:11:39.816893 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809426 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 20:11:39.816893 ip-10-0-141-145 kubenswrapper[2576]: W0416 
20:11:39.809429 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 20:11:39.816893 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809431 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 20:11:39.816893 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809434 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 20:11:39.816893 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809436 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 20:11:39.816893 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809440 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 20:11:39.816893 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809442 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 20:11:39.816893 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809445 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 20:11:39.816893 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809448 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 20:11:39.816893 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809450 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 20:11:39.816893 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809453 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 20:11:39.816893 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809456 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 20:11:39.816893 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809458 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 20:11:39.817434 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809461 2576 feature_gate.go:328] unrecognized feature gate: 
NetworkSegmentation
Apr 16 20:11:39.817434 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809463 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:11:39.817434 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809466 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:11:39.817434 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809468 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:11:39.817434 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809470 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:11:39.817434 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.809473 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:11:39.817434 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.810187 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 20:11:39.817739 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.817722 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 20:11:39.817770 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.817740 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 20:11:39.817802 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817786 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:11:39.817802 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817791 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:11:39.817802 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817794 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:11:39.817802 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817798 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:11:39.817802 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817800 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:11:39.817802 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817803 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:11:39.817948 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817806 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:11:39.817948 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817810 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:11:39.817948 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817812 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:11:39.817948 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817815 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:11:39.817948 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817818 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:11:39.817948 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817820 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:11:39.817948 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817823 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:11:39.817948 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817825 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:11:39.817948 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817828 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:11:39.817948 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817830 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:11:39.817948 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817833 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:11:39.817948 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817835 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:11:39.817948 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817838 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:11:39.817948 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817841 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:11:39.817948 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817843 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:11:39.817948 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817846 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:11:39.817948 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817848 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:11:39.817948 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817850 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:11:39.817948 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817856 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:11:39.817948 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817859 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:11:39.818442 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817861 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:11:39.818442 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817863 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:11:39.818442 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817866 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:11:39.818442 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817868 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:11:39.818442 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817871 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:11:39.818442 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817873 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:11:39.818442 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817876 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:11:39.818442 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817880 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:11:39.818442 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817884 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:11:39.818442 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817887 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:11:39.818442 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817891 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:11:39.818442 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817893 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:11:39.818442 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817896 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:11:39.818442 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817900 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:11:39.818442 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817903 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:11:39.818442 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817906 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:11:39.818442 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817909 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:11:39.818442 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817911 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:11:39.818442 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817913 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:11:39.818889 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817917 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:11:39.818889 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817921 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:11:39.818889 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817924 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:11:39.818889 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817927 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:11:39.818889 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817929 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:11:39.818889 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817932 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:11:39.818889 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817935 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:11:39.818889 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817937 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:11:39.818889 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817939 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:11:39.818889 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817942 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:11:39.818889 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817944 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:11:39.818889 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817947 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:11:39.818889 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817949 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:11:39.818889 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817952 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:11:39.818889 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817955 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:11:39.818889 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817958 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:11:39.818889 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817960 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:11:39.818889 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817963 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:11:39.818889 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817965 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:11:39.819380 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817968 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:11:39.819380 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817970 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:11:39.819380 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817973 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:11:39.819380 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817990 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:11:39.819380 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817993 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:11:39.819380 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817996 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:11:39.819380 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.817998 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:11:39.819380 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818002 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:11:39.819380 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818004 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:11:39.819380 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818007 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:11:39.819380 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818010 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:11:39.819380 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818012 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:11:39.819380 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818016 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:11:39.819380 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818019 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:11:39.819380 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818021 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:11:39.819380 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818023 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:11:39.819380 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818026 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:11:39.819380 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818028 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:11:39.819380 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818031 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:11:39.819380 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818033 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:11:39.819843 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818036 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:11:39.819843 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818038 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:11:39.819843 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.818044 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 20:11:39.819843 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818147 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:11:39.819843 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818153 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:11:39.819843 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818155 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:11:39.819843 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818158 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:11:39.819843 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818160 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:11:39.819843 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818163 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:11:39.819843 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818165 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:11:39.819843 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818168 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:11:39.819843 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818172 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:11:39.819843 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818176 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:11:39.819843 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818179 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:11:39.819843 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818182 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:11:39.820228 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818185 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:11:39.820228 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818188 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:11:39.820228 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818191 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:11:39.820228 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818193 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:11:39.820228 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818196 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:11:39.820228 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818199 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:11:39.820228 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818202 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:11:39.820228 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818205 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:11:39.820228 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818208 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:11:39.820228 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818211 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:11:39.820228 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818214 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:11:39.820228 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818217 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:11:39.820228 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818219 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:11:39.820228 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818222 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:11:39.820228 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818225 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:11:39.820228 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818227 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:11:39.820228 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818230 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:11:39.820228 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818232 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:11:39.820228 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818235 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:11:39.820228 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818238 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:11:39.820694 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818240 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:11:39.820694 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818243 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:11:39.820694 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818245 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:11:39.820694 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818248 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:11:39.820694 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818250 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:11:39.820694 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818253 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:11:39.820694 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818255 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:11:39.820694 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818257 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:11:39.820694 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818260 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:11:39.820694 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818263 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:11:39.820694 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818266 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:11:39.820694 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818268 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:11:39.820694 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818271 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:11:39.820694 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818273 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:11:39.820694 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818276 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:11:39.820694 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818278 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:11:39.820694 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818280 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:11:39.820694 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818283 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:11:39.820694 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818286 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:11:39.820694 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818288 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:11:39.821242 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818291 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:11:39.821242 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818293 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:11:39.821242 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818296 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:11:39.821242 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818299 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:11:39.821242 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818301 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:11:39.821242 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818303 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:11:39.821242 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818306 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:11:39.821242 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818308 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:11:39.821242 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818311 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:11:39.821242 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818313 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:11:39.821242 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818316 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:11:39.821242 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818318 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:11:39.821242 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818320 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:11:39.821242 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818323 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:11:39.821242 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818325 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:11:39.821242 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818328 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:11:39.821242 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818330 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:11:39.821242 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818333 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:11:39.821242 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818335 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:11:39.821242 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818338 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:11:39.821703 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818340 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:11:39.821703 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818343 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:11:39.821703 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818345 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:11:39.821703 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818349 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:11:39.821703 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818352 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:11:39.821703 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818354 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:11:39.821703 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818357 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:11:39.821703 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818359 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:11:39.821703 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818362 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:11:39.821703 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818365 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:11:39.821703 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818368 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:11:39.821703 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818370 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:11:39.821703 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818373 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:11:39.821703 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.818375 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:11:39.821703 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.818380 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 20:11:39.822071 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.819233 2576 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 20:11:39.822821 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.822807 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 20:11:39.825429 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.825418 2576 server.go:1019] "Starting client certificate rotation"
Apr 16 20:11:39.825530 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.825514 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 20:11:39.825571 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.825553 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 20:11:39.851450 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.851432 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 20:11:39.854476 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.854459 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 20:11:39.865294 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.865277 2576 log.go:25] "Validated CRI v1 runtime API"
Apr 16 20:11:39.870498 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.870482 2576 log.go:25] "Validated CRI v1 image API"
Apr 16 20:11:39.872919 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.872906 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 20:11:39.876309 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.876286 2576 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 f2b2ff41-2108-48ae-9a9f-9951bd3e2764:/dev/nvme0n1p4 f30b715c-1578-4f04-8913-e4f11ba9e95e:/dev/nvme0n1p3]
Apr 16 20:11:39.876394 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.876307 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 20:11:39.882999 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.882876 2576 manager.go:217] Machine: {Timestamp:2026-04-16 20:11:39.880342561 +0000 UTC m=+0.414698282 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3200196 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2d0b88da296e543c5b6397e0f21c4d SystemUUID:ec2d0b88-da29-6e54-3c5b-6397e0f21c4d BootID:f4449b31-3c8a-4e0a-9698-a707256982f6 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:0d:cb:c9:c7:51 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:0d:cb:c9:c7:51 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:6a:76:31:71:e0:a8 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 20:11:39.882999 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.882989 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 20:11:39.883130 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.883060 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 20:11:39.883971 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.883951 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 20:11:39.884108 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.883973 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-141-145.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 20:11:39.884152 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.884117 2576 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 20:11:39.884152 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.884125 2576 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 20:11:39.884152 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.884138 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 20:11:39.884804 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.884795 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 20:11:39.885611 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.885600 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 20:11:39.885709 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.885701 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 20:11:39.888485 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.888476 2576 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 20:11:39.888521 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.888489 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 20:11:39.888521 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.888504 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 20:11:39.888521 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.888512 2576 kubelet.go:397] "Adding apiserver pod source"
Apr 16 20:11:39.888521 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.888520 2576 apiserver.go:42] "Waiting for node sync
before watching apiserver pods" Apr 16 20:11:39.888773 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.888754 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 20:11:39.890141 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.890129 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 20:11:39.890198 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.890146 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 20:11:39.893025 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.893004 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 20:11:39.894243 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.894230 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 20:11:39.895963 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.895950 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 20:11:39.896022 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.895972 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 20:11:39.896022 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.895997 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 20:11:39.896022 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.896006 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 20:11:39.896022 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.896014 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 20:11:39.896022 
ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.896022 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 20:11:39.896146 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.896031 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 20:11:39.896146 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.896039 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 20:11:39.896146 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.896049 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 20:11:39.896146 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.896060 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 20:11:39.896146 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.896084 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 20:11:39.896146 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.896099 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 20:11:39.897493 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.897481 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 20:11:39.897493 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.897494 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 20:11:39.900746 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.900733 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 20:11:39.900793 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.900768 2576 server.go:1295] "Started kubelet" Apr 16 20:11:39.901388 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.901370 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-141-145.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group 
"storage.k8s.io" at the cluster scope Apr 16 20:11:39.901447 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:39.901375 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-141-145.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 20:11:39.901417 ip-10-0-141-145 systemd[1]: Started Kubernetes Kubelet. Apr 16 20:11:39.901695 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:39.901674 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 20:11:39.902211 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.902043 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 20:11:39.902211 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.902096 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 20:11:39.902211 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.902057 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 20:11:39.903490 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.903473 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 20:11:39.904459 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.904448 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 16 20:11:39.909190 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.909171 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 20:11:39.909722 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.909703 
2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 20:11:39.909722 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:39.908799 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-145.ec2.internal.18a6ef604b2355aa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-145.ec2.internal,UID:ip-10-0-141-145.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-141-145.ec2.internal,},FirstTimestamp:2026-04-16 20:11:39.90074513 +0000 UTC m=+0.435100846,LastTimestamp:2026-04-16 20:11:39.90074513 +0000 UTC m=+0.435100846,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-145.ec2.internal,}" Apr 16 20:11:39.911869 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:39.911845 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-145.ec2.internal\" not found" Apr 16 20:11:39.911869 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.911870 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 20:11:39.912021 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.911882 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 20:11:39.912021 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.911912 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 20:11:39.912021 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.911992 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 16 20:11:39.912021 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.912005 2576 reconciler.go:26] "Reconciler: 
start to sync state" Apr 16 20:11:39.912197 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.912130 2576 factory.go:55] Registering systemd factory Apr 16 20:11:39.912197 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.912188 2576 factory.go:223] Registration of the systemd container factory successfully Apr 16 20:11:39.912406 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.912391 2576 factory.go:153] Registering CRI-O factory Apr 16 20:11:39.912406 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.912407 2576 factory.go:223] Registration of the crio container factory successfully Apr 16 20:11:39.912492 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.912456 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 20:11:39.913299 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.913109 2576 factory.go:103] Registering Raw factory Apr 16 20:11:39.913511 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.913357 2576 manager.go:1196] Started watching for new ooms in manager Apr 16 20:11:39.913511 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:39.913458 2576 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 20:11:39.913833 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.913819 2576 manager.go:319] Starting recovery of all containers Apr 16 20:11:39.918735 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:39.918573 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 20:11:39.918735 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:39.918666 2576 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-141-145.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 20:11:39.923586 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:39.923570 2576 helpers.go:245] readString: Failed to read "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service/pids.max": read /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service/pids.max: no such device Apr 16 20:11:39.923875 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.923864 2576 manager.go:324] Recovery completed Apr 16 20:11:39.927509 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.927494 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:11:39.929588 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.929564 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-145.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:11:39.929672 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.929602 2576 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-141-145.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:11:39.929672 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.929619 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-145.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:11:39.930055 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.930042 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 20:11:39.930093 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.930056 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 20:11:39.930093 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.930075 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 16 20:11:39.931641 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:39.931580 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-145.ec2.internal.18a6ef604cdb6be3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-145.ec2.internal,UID:ip-10-0-141-145.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-141-145.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-141-145.ec2.internal,},FirstTimestamp:2026-04-16 20:11:39.929586659 +0000 UTC m=+0.463942380,LastTimestamp:2026-04-16 20:11:39.929586659 +0000 UTC m=+0.463942380,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-145.ec2.internal,}" Apr 16 20:11:39.932176 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.932164 2576 policy_none.go:49] "None policy: Start" Apr 16 20:11:39.932234 ip-10-0-141-145 kubenswrapper[2576]: I0416 
20:11:39.932180 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 20:11:39.932234 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.932189 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 16 20:11:39.939954 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:39.939897 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-145.ec2.internal.18a6ef604cdbc9d9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-145.ec2.internal,UID:ip-10-0-141-145.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-141-145.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-141-145.ec2.internal,},FirstTimestamp:2026-04-16 20:11:39.929610713 +0000 UTC m=+0.463966431,LastTimestamp:2026-04-16 20:11:39.929610713 +0000 UTC m=+0.463966431,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-145.ec2.internal,}" Apr 16 20:11:39.949427 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.949408 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-b6svv" Apr 16 20:11:39.951189 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:39.951109 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-145.ec2.internal.18a6ef604cdbffa7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-145.ec2.internal,UID:ip-10-0-141-145.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-141-145.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-141-145.ec2.internal,},FirstTimestamp:2026-04-16 20:11:39.929624487 +0000 UTC m=+0.463980211,LastTimestamp:2026-04-16 20:11:39.929624487 +0000 UTC m=+0.463980211,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-145.ec2.internal,}" Apr 16 20:11:39.957282 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.957266 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-b6svv" Apr 16 20:11:39.966507 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.966494 2576 manager.go:341] "Starting Device Plugin manager" Apr 16 20:11:39.981356 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:39.966540 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 20:11:39.981356 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.966552 2576 server.go:85] "Starting device plugin registration server" Apr 16 20:11:39.981356 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.966768 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 20:11:39.981356 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.966779 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 20:11:39.981356 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.966885 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 20:11:39.981356 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.966956 2576 plugin_manager.go:116] "The 
desired_state_of_world populator (plugin watcher) starts" Apr 16 20:11:39.981356 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:39.966965 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 20:11:39.981356 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:39.967329 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 20:11:39.981356 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:39.967365 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-145.ec2.internal\" not found" Apr 16 20:11:40.052991 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.052961 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 20:11:40.054089 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.054044 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 20:11:40.054089 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.054070 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 20:11:40.054089 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.054085 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 20:11:40.054089 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.054091 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 20:11:40.054297 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:40.054118 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 20:11:40.057609 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.057591 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:11:40.067518 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.067505 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:11:40.069425 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.069405 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-145.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:11:40.069508 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.069431 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-145.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:11:40.069508 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.069442 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-145.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:11:40.069508 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.069465 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-145.ec2.internal" Apr 16 20:11:40.079158 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.079141 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-145.ec2.internal" Apr 16 20:11:40.079214 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:40.079159 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-141-145.ec2.internal\": node \"ip-10-0-141-145.ec2.internal\" not found" Apr 16 
20:11:40.091534 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:40.091515 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-145.ec2.internal\" not found" Apr 16 20:11:40.154478 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.154443 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-145.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-141-145.ec2.internal"] Apr 16 20:11:40.154548 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.154517 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:11:40.155251 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.155232 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-145.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:11:40.155320 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.155257 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-145.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:11:40.155320 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.155266 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-145.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:11:40.156446 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.156435 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:11:40.156582 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.156569 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-145.ec2.internal" Apr 16 20:11:40.156629 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.156593 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:11:40.157647 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.157628 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-145.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:11:40.157723 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.157654 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-145.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:11:40.157723 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.157665 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-145.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:11:40.157723 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.157665 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-145.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:11:40.157723 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.157682 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-145.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:11:40.157723 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.157692 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-145.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:11:40.158763 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.158750 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-145.ec2.internal" Apr 16 20:11:40.158807 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.158771 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:11:40.159413 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.159396 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-145.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:11:40.159490 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.159420 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-145.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:11:40.159490 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.159431 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-145.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:11:40.178488 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:40.178466 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-145.ec2.internal\" not found" node="ip-10-0-141-145.ec2.internal" Apr 16 20:11:40.182224 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:40.182206 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-145.ec2.internal\" not found" node="ip-10-0-141-145.ec2.internal" Apr 16 20:11:40.192117 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:40.192094 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-145.ec2.internal\" not found" Apr 16 20:11:40.293094 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:40.293077 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-145.ec2.internal\" not found" Apr 16 20:11:40.313547 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.313512 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/215ca7d29ca438683d398763be889a08-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-145.ec2.internal\" (UID: \"215ca7d29ca438683d398763be889a08\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-145.ec2.internal" Apr 16 20:11:40.313547 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.313537 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/215ca7d29ca438683d398763be889a08-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-145.ec2.internal\" (UID: \"215ca7d29ca438683d398763be889a08\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-145.ec2.internal" Apr 16 20:11:40.313623 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.313554 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ea24cdd89c298d8269b8d7acdbb62d3a-config\") pod \"kube-apiserver-proxy-ip-10-0-141-145.ec2.internal\" (UID: \"ea24cdd89c298d8269b8d7acdbb62d3a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-145.ec2.internal" Apr 16 20:11:40.393865 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:40.393850 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-145.ec2.internal\" not found" Apr 16 20:11:40.414252 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.414226 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/215ca7d29ca438683d398763be889a08-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-145.ec2.internal\" (UID: \"215ca7d29ca438683d398763be889a08\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-145.ec2.internal" Apr 16 
20:11:40.414304 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.414258 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/215ca7d29ca438683d398763be889a08-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-145.ec2.internal\" (UID: \"215ca7d29ca438683d398763be889a08\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-145.ec2.internal" Apr 16 20:11:40.414304 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.414272 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ea24cdd89c298d8269b8d7acdbb62d3a-config\") pod \"kube-apiserver-proxy-ip-10-0-141-145.ec2.internal\" (UID: \"ea24cdd89c298d8269b8d7acdbb62d3a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-145.ec2.internal" Apr 16 20:11:40.414361 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.414304 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ea24cdd89c298d8269b8d7acdbb62d3a-config\") pod \"kube-apiserver-proxy-ip-10-0-141-145.ec2.internal\" (UID: \"ea24cdd89c298d8269b8d7acdbb62d3a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-145.ec2.internal" Apr 16 20:11:40.414361 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.414315 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/215ca7d29ca438683d398763be889a08-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-145.ec2.internal\" (UID: \"215ca7d29ca438683d398763be889a08\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-145.ec2.internal" Apr 16 20:11:40.414361 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.414325 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/215ca7d29ca438683d398763be889a08-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-145.ec2.internal\" (UID: \"215ca7d29ca438683d398763be889a08\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-145.ec2.internal" Apr 16 20:11:40.481370 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.481354 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-145.ec2.internal" Apr 16 20:11:40.485014 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.484995 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-145.ec2.internal" Apr 16 20:11:40.494700 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:40.494684 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-145.ec2.internal\" not found" Apr 16 20:11:40.595275 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:40.595227 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-145.ec2.internal\" not found" Apr 16 20:11:40.695816 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:40.695795 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-145.ec2.internal\" not found" Apr 16 20:11:40.796337 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:40.796316 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-145.ec2.internal\" not found" Apr 16 20:11:40.825886 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.825867 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 20:11:40.826321 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.825992 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" 
type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 20:11:40.896643 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:40.896618 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-145.ec2.internal\" not found" Apr 16 20:11:40.909885 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.909867 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 20:11:40.943575 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.943553 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 20:11:40.959714 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.959671 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 20:06:39 +0000 UTC" deadline="2027-11-30 17:55:18.246320658 +0000 UTC" Apr 16 20:11:40.959789 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.959709 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14229h43m37.286615687s" Apr 16 20:11:40.972921 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:40.972897 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod215ca7d29ca438683d398763be889a08.slice/crio-a25331d86c0a89dbb1f8a426926c6dba3d6f35caceacf5ed47c1de1abc091140 WatchSource:0}: Error finding container a25331d86c0a89dbb1f8a426926c6dba3d6f35caceacf5ed47c1de1abc091140: Status 404 returned error can't find the container with id a25331d86c0a89dbb1f8a426926c6dba3d6f35caceacf5ed47c1de1abc091140 Apr 16 20:11:40.973161 ip-10-0-141-145 kubenswrapper[2576]: W0416 
20:11:40.973142 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea24cdd89c298d8269b8d7acdbb62d3a.slice/crio-278b73248fb9067c451f4be289685d9ad49efb929b949af6244a99cf6eb4a6a6 WatchSource:0}: Error finding container 278b73248fb9067c451f4be289685d9ad49efb929b949af6244a99cf6eb4a6a6: Status 404 returned error can't find the container with id 278b73248fb9067c451f4be289685d9ad49efb929b949af6244a99cf6eb4a6a6 Apr 16 20:11:40.977429 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.977417 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:11:40.994330 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:40.994311 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-c9zvv" Apr 16 20:11:40.997420 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:40.997405 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-145.ec2.internal\" not found" Apr 16 20:11:41.010304 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.010280 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-c9zvv" Apr 16 20:11:41.056614 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.056574 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-145.ec2.internal" event={"ID":"215ca7d29ca438683d398763be889a08","Type":"ContainerStarted","Data":"a25331d86c0a89dbb1f8a426926c6dba3d6f35caceacf5ed47c1de1abc091140"} Apr 16 20:11:41.057514 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.057495 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-145.ec2.internal" 
event={"ID":"ea24cdd89c298d8269b8d7acdbb62d3a","Type":"ContainerStarted","Data":"278b73248fb9067c451f4be289685d9ad49efb929b949af6244a99cf6eb4a6a6"} Apr 16 20:11:41.097676 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:41.097652 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-145.ec2.internal\" not found" Apr 16 20:11:41.194261 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.194219 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:11:41.198390 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:41.198373 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-145.ec2.internal\" not found" Apr 16 20:11:41.298985 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:41.298964 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-145.ec2.internal\" not found" Apr 16 20:11:41.328572 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.328552 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:11:41.397968 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.397943 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:11:41.410943 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.410922 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-145.ec2.internal" Apr 16 20:11:41.447264 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.447203 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 20:11:41.448160 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.448136 2576 kubelet.go:3340] "Creating a mirror pod 
for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-145.ec2.internal" Apr 16 20:11:41.466169 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.466149 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 20:11:41.890357 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.890295 2576 apiserver.go:52] "Watching apiserver" Apr 16 20:11:41.896899 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.896875 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 20:11:41.897298 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.897276 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4t2rt","kube-system/konnectivity-agent-7px7k","kube-system/kube-apiserver-proxy-ip-10-0-141-145.ec2.internal","openshift-image-registry/node-ca-j59fj","openshift-multus/multus-additional-cni-plugins-xztj4","openshift-multus/multus-cmxqh","openshift-network-diagnostics/network-check-target-54bk5","openshift-network-operator/iptables-alerter-sdjtv","kube-system/global-pull-secret-syncer-gj7ch","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgmkg","openshift-cluster-node-tuning-operator/tuned-8r25h","openshift-dns/node-resolver-pcr4b","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-145.ec2.internal","openshift-multus/network-metrics-daemon-q46p8"] Apr 16 20:11:41.900721 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.900692 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-54bk5" Apr 16 20:11:41.900825 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:41.900783 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-54bk5" podUID="958053a3-75a0-41f2-8f2b-9790c4c625d8" Apr 16 20:11:41.902384 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.902362 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:41.904343 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.904217 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q46p8" Apr 16 20:11:41.904343 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:41.904288 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q46p8" podUID="d6bfc3ff-419c-47de-880a-e6eeafc7247a" Apr 16 20:11:41.905442 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.905424 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 20:11:41.905526 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.905425 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:11:41.905763 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.905639 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-wpxkn\"" Apr 16 20:11:41.906460 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.906438 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xztj4" Apr 16 20:11:41.908372 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.908353 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-cmxqh" Apr 16 20:11:41.908685 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.908668 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 20:11:41.908972 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.908955 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 20:11:41.909067 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.909027 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 20:11:41.909169 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.909151 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 20:11:41.909238 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.909221 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-tl9k7\"" Apr 16 20:11:41.909238 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.909234 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 20:11:41.910369 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.910349 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 20:11:41.910475 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.910441 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-j59fj" Apr 16 20:11:41.911195 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.911176 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-phqrk\"" Apr 16 20:11:41.912957 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.912175 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-sdjtv" Apr 16 20:11:41.912957 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.912607 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 20:11:41.912957 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.912641 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-sz2sq\"" Apr 16 20:11:41.912957 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.912803 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 20:11:41.913503 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.913351 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 20:11:41.914390 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.914063 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-pcr4b" Apr 16 20:11:41.914390 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.914246 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-7px7k" Apr 16 20:11:41.914390 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.914293 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:11:41.914563 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.914525 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-vv7fc\"" Apr 16 20:11:41.914614 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.914601 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 20:11:41.914669 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.914635 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 20:11:41.916210 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.916172 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgmkg" Apr 16 20:11:41.916385 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.916366 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 20:11:41.916745 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.916727 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 20:11:41.916946 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.916851 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 20:11:41.916946 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.916916 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-pvnl8\"" Apr 16 20:11:41.917101 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.917003 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 20:11:41.917156 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.917102 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-h7ffq\"" Apr 16 20:11:41.918283 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.918263 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt" Apr 16 20:11:41.919013 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.918995 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-cr48n\"" Apr 16 20:11:41.919100 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.919003 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 20:11:41.919318 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.919272 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 20:11:41.919525 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.919509 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 20:11:41.920211 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.920192 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gj7ch" Apr 16 20:11:41.920303 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:41.920255 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-gj7ch" podUID="299edf22-1b19-43c6-aa15-8124389d617a" Apr 16 20:11:41.920869 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.920672 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 20:11:41.920869 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.920727 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 20:11:41.920869 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.920752 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 20:11:41.920869 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.920763 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-bm2x5\"" Apr 16 20:11:41.921146 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.920971 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 20:11:41.921367 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.921309 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 20:11:41.921367 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.921201 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 20:11:41.923136 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.923114 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b94d0ed6-5b50-4eec-955c-34f48467eb09-etc-systemd\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " 
pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:41.923223 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.923148 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b94d0ed6-5b50-4eec-955c-34f48467eb09-lib-modules\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:41.923223 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.923170 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b94d0ed6-5b50-4eec-955c-34f48467eb09-host\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:41.923223 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.923193 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-multus-cni-dir\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:41.923346 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.923236 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fd63426d-fc95-432e-ad91-1498d43b0e04-cni-binary-copy\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:41.923346 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.923265 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-host-run-netns\") pod \"multus-cmxqh\" 
(UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:41.923346 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.923290 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-multus-conf-dir\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:41.923346 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.923314 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b94d0ed6-5b50-4eec-955c-34f48467eb09-etc-sysconfig\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:41.923346 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.923336 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-os-release\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:41.923577 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.923373 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-host-var-lib-cni-bin\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:41.923577 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.923401 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/fd63426d-fc95-432e-ad91-1498d43b0e04-multus-daemon-config\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:41.923577 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.923424 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqt25\" (UniqueName: \"kubernetes.io/projected/fd63426d-fc95-432e-ad91-1498d43b0e04-kube-api-access-mqt25\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:41.923577 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.923446 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cf44a204-89e9-429e-a562-8e270453e2d3-hosts-file\") pod \"node-resolver-pcr4b\" (UID: \"cf44a204-89e9-429e-a562-8e270453e2d3\") " pod="openshift-dns/node-resolver-pcr4b" Apr 16 20:11:41.923577 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.923470 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b94d0ed6-5b50-4eec-955c-34f48467eb09-etc-kubernetes\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:41.923577 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.923528 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/74b468ec-61b6-4b33-96b9-598cc8545771-system-cni-dir\") pod \"multus-additional-cni-plugins-xztj4\" (UID: \"74b468ec-61b6-4b33-96b9-598cc8545771\") " pod="openshift-multus/multus-additional-cni-plugins-xztj4" Apr 16 20:11:41.923577 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.923573 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/74b468ec-61b6-4b33-96b9-598cc8545771-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xztj4\" (UID: \"74b468ec-61b6-4b33-96b9-598cc8545771\") " pod="openshift-multus/multus-additional-cni-plugins-xztj4" Apr 16 20:11:41.923888 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.923601 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-hostroot\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:41.923888 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.923624 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-etc-kubernetes\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:41.923888 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.923710 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/74366810-9860-49b4-8e6b-049f12b47689-host-slash\") pod \"iptables-alerter-sdjtv\" (UID: \"74366810-9860-49b4-8e6b-049f12b47689\") " pod="openshift-network-operator/iptables-alerter-sdjtv" Apr 16 20:11:41.923888 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.923739 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glzjq\" (UniqueName: \"kubernetes.io/projected/b94d0ed6-5b50-4eec-955c-34f48467eb09-kube-api-access-glzjq\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" 
Apr 16 20:11:41.923888 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.923762 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/74b468ec-61b6-4b33-96b9-598cc8545771-cnibin\") pod \"multus-additional-cni-plugins-xztj4\" (UID: \"74b468ec-61b6-4b33-96b9-598cc8545771\") " pod="openshift-multus/multus-additional-cni-plugins-xztj4" Apr 16 20:11:41.923888 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.923786 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/74b468ec-61b6-4b33-96b9-598cc8545771-os-release\") pod \"multus-additional-cni-plugins-xztj4\" (UID: \"74b468ec-61b6-4b33-96b9-598cc8545771\") " pod="openshift-multus/multus-additional-cni-plugins-xztj4" Apr 16 20:11:41.923888 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.923819 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/74b468ec-61b6-4b33-96b9-598cc8545771-cni-binary-copy\") pod \"multus-additional-cni-plugins-xztj4\" (UID: \"74b468ec-61b6-4b33-96b9-598cc8545771\") " pod="openshift-multus/multus-additional-cni-plugins-xztj4" Apr 16 20:11:41.923888 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.923843 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m6lt\" (UniqueName: \"kubernetes.io/projected/74b468ec-61b6-4b33-96b9-598cc8545771-kube-api-access-5m6lt\") pod \"multus-additional-cni-plugins-xztj4\" (UID: \"74b468ec-61b6-4b33-96b9-598cc8545771\") " pod="openshift-multus/multus-additional-cni-plugins-xztj4" Apr 16 20:11:41.923888 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.923880 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-system-cni-dir\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:41.924304 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.923903 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-host-run-multus-certs\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:41.924304 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.923926 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2m5g\" (UniqueName: \"kubernetes.io/projected/74366810-9860-49b4-8e6b-049f12b47689-kube-api-access-f2m5g\") pod \"iptables-alerter-sdjtv\" (UID: \"74366810-9860-49b4-8e6b-049f12b47689\") " pod="openshift-network-operator/iptables-alerter-sdjtv" Apr 16 20:11:41.924304 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.923947 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctsdd\" (UniqueName: \"kubernetes.io/projected/958053a3-75a0-41f2-8f2b-9790c4c625d8-kube-api-access-ctsdd\") pod \"network-check-target-54bk5\" (UID: \"958053a3-75a0-41f2-8f2b-9790c4c625d8\") " pod="openshift-network-diagnostics/network-check-target-54bk5" Apr 16 20:11:41.924304 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.924002 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b94d0ed6-5b50-4eec-955c-34f48467eb09-etc-tuned\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:41.924304 ip-10-0-141-145 kubenswrapper[2576]: I0416 
20:11:41.924024 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/74b468ec-61b6-4b33-96b9-598cc8545771-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xztj4\" (UID: \"74b468ec-61b6-4b33-96b9-598cc8545771\") " pod="openshift-multus/multus-additional-cni-plugins-xztj4" Apr 16 20:11:41.924304 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.924047 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/74b468ec-61b6-4b33-96b9-598cc8545771-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xztj4\" (UID: \"74b468ec-61b6-4b33-96b9-598cc8545771\") " pod="openshift-multus/multus-additional-cni-plugins-xztj4" Apr 16 20:11:41.924304 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.924071 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cf44a204-89e9-429e-a562-8e270453e2d3-tmp-dir\") pod \"node-resolver-pcr4b\" (UID: \"cf44a204-89e9-429e-a562-8e270453e2d3\") " pod="openshift-dns/node-resolver-pcr4b" Apr 16 20:11:41.924304 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.924095 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8czn\" (UniqueName: \"kubernetes.io/projected/cf44a204-89e9-429e-a562-8e270453e2d3-kube-api-access-x8czn\") pod \"node-resolver-pcr4b\" (UID: \"cf44a204-89e9-429e-a562-8e270453e2d3\") " pod="openshift-dns/node-resolver-pcr4b" Apr 16 20:11:41.924304 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.924119 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b94d0ed6-5b50-4eec-955c-34f48467eb09-etc-modprobe-d\") pod 
\"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:41.924304 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.924139 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b94d0ed6-5b50-4eec-955c-34f48467eb09-etc-sysctl-conf\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:41.924304 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.924160 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b94d0ed6-5b50-4eec-955c-34f48467eb09-run\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:41.924304 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.924196 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b94d0ed6-5b50-4eec-955c-34f48467eb09-tmp\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:41.924304 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.924234 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6bfc3ff-419c-47de-880a-e6eeafc7247a-metrics-certs\") pod \"network-metrics-daemon-q46p8\" (UID: \"d6bfc3ff-419c-47de-880a-e6eeafc7247a\") " pod="openshift-multus/network-metrics-daemon-q46p8" Apr 16 20:11:41.924304 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.924258 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-cnibin\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:41.924304 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.924285 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-host-var-lib-cni-multus\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:41.924935 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.924344 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2e2f3589-a853-4295-aedb-5f577abc797b-host\") pod \"node-ca-j59fj\" (UID: \"2e2f3589-a853-4295-aedb-5f577abc797b\") " pod="openshift-image-registry/node-ca-j59fj" Apr 16 20:11:41.924935 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.924373 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b94d0ed6-5b50-4eec-955c-34f48467eb09-etc-sysctl-d\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:41.924935 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.924398 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-multus-socket-dir-parent\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:41.924935 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.924421 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-host-var-lib-kubelet\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:41.924935 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.924444 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/74366810-9860-49b4-8e6b-049f12b47689-iptables-alerter-script\") pod \"iptables-alerter-sdjtv\" (UID: \"74366810-9860-49b4-8e6b-049f12b47689\") " pod="openshift-network-operator/iptables-alerter-sdjtv" Apr 16 20:11:41.924935 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.924465 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2e2f3589-a853-4295-aedb-5f577abc797b-serviceca\") pod \"node-ca-j59fj\" (UID: \"2e2f3589-a853-4295-aedb-5f577abc797b\") " pod="openshift-image-registry/node-ca-j59fj" Apr 16 20:11:41.924935 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.924484 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b94d0ed6-5b50-4eec-955c-34f48467eb09-sys\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:41.924935 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.924511 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b94d0ed6-5b50-4eec-955c-34f48467eb09-var-lib-kubelet\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:41.924935 ip-10-0-141-145 kubenswrapper[2576]: I0416 
20:11:41.924537 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjx5h\" (UniqueName: \"kubernetes.io/projected/d6bfc3ff-419c-47de-880a-e6eeafc7247a-kube-api-access-sjx5h\") pod \"network-metrics-daemon-q46p8\" (UID: \"d6bfc3ff-419c-47de-880a-e6eeafc7247a\") " pod="openshift-multus/network-metrics-daemon-q46p8" Apr 16 20:11:41.924935 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.924561 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-host-run-k8s-cni-cncf-io\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:41.924935 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:41.924590 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2xjj\" (UniqueName: \"kubernetes.io/projected/2e2f3589-a853-4295-aedb-5f577abc797b-kube-api-access-h2xjj\") pod \"node-ca-j59fj\" (UID: \"2e2f3589-a853-4295-aedb-5f577abc797b\") " pod="openshift-image-registry/node-ca-j59fj" Apr 16 20:11:42.010866 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.010838 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 20:06:40 +0000 UTC" deadline="2027-10-28 10:48:19.028238461 +0000 UTC" Apr 16 20:11:42.010866 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.010863 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13430h36m37.017378012s" Apr 16 20:11:42.013161 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.013144 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 20:11:42.025507 ip-10-0-141-145 kubenswrapper[2576]: I0416 
20:11:42.025486 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b94d0ed6-5b50-4eec-955c-34f48467eb09-etc-systemd\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:42.025596 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.025518 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b94d0ed6-5b50-4eec-955c-34f48467eb09-lib-modules\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:42.025596 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.025543 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-multus-cni-dir\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:42.025596 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.025563 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fd63426d-fc95-432e-ad91-1498d43b0e04-cni-binary-copy\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:42.025753 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.025630 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b94d0ed6-5b50-4eec-955c-34f48467eb09-etc-systemd\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:42.025753 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.025659 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-multus-cni-dir\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:42.025753 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.025678 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-multus-conf-dir\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:42.025753 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.025716 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-multus-conf-dir\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:42.025753 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.025716 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1c8b43bd-c782-4cdf-bbc1-459c65b0298b-etc-selinux\") pod \"aws-ebs-csi-driver-node-cgmkg\" (UID: \"1c8b43bd-c782-4cdf-bbc1-459c65b0298b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgmkg" Apr 16 20:11:42.025968 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.025749 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b94d0ed6-5b50-4eec-955c-34f48467eb09-lib-modules\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:42.025968 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.025759 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1c8b43bd-c782-4cdf-bbc1-459c65b0298b-sys-fs\") pod \"aws-ebs-csi-driver-node-cgmkg\" (UID: \"1c8b43bd-c782-4cdf-bbc1-459c65b0298b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgmkg" Apr 16 20:11:42.025968 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.025792 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-run-openvswitch\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt" Apr 16 20:11:42.025968 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.025848 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b94d0ed6-5b50-4eec-955c-34f48467eb09-etc-sysconfig\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:42.025968 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.025878 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-os-release\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:42.025968 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.025887 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b94d0ed6-5b50-4eec-955c-34f48467eb09-etc-sysconfig\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:42.025968 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.025903 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-host-var-lib-cni-bin\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:42.025968 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.025932 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1c8b43bd-c782-4cdf-bbc1-459c65b0298b-socket-dir\") pod \"aws-ebs-csi-driver-node-cgmkg\" (UID: \"1c8b43bd-c782-4cdf-bbc1-459c65b0298b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgmkg" Apr 16 20:11:42.025968 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.025951 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-os-release\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:42.025968 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.025958 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-log-socket\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt" Apr 16 20:11:42.026394 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.025966 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-host-var-lib-cni-bin\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:42.026394 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026014 
2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/74b468ec-61b6-4b33-96b9-598cc8545771-system-cni-dir\") pod \"multus-additional-cni-plugins-xztj4\" (UID: \"74b468ec-61b6-4b33-96b9-598cc8545771\") " pod="openshift-multus/multus-additional-cni-plugins-xztj4" Apr 16 20:11:42.026394 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026039 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-hostroot\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:42.026394 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026049 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/74b468ec-61b6-4b33-96b9-598cc8545771-system-cni-dir\") pod \"multus-additional-cni-plugins-xztj4\" (UID: \"74b468ec-61b6-4b33-96b9-598cc8545771\") " pod="openshift-multus/multus-additional-cni-plugins-xztj4" Apr 16 20:11:42.026394 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026062 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/74b468ec-61b6-4b33-96b9-598cc8545771-os-release\") pod \"multus-additional-cni-plugins-xztj4\" (UID: \"74b468ec-61b6-4b33-96b9-598cc8545771\") " pod="openshift-multus/multus-additional-cni-plugins-xztj4" Apr 16 20:11:42.026394 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026086 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-hostroot\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:42.026394 ip-10-0-141-145 kubenswrapper[2576]: I0416 
20:11:42.026087 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5m6lt\" (UniqueName: \"kubernetes.io/projected/74b468ec-61b6-4b33-96b9-598cc8545771-kube-api-access-5m6lt\") pod \"multus-additional-cni-plugins-xztj4\" (UID: \"74b468ec-61b6-4b33-96b9-598cc8545771\") " pod="openshift-multus/multus-additional-cni-plugins-xztj4" Apr 16 20:11:42.026394 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026103 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fd63426d-fc95-432e-ad91-1498d43b0e04-cni-binary-copy\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:42.026394 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026110 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/74b468ec-61b6-4b33-96b9-598cc8545771-os-release\") pod \"multus-additional-cni-plugins-xztj4\" (UID: \"74b468ec-61b6-4b33-96b9-598cc8545771\") " pod="openshift-multus/multus-additional-cni-plugins-xztj4" Apr 16 20:11:42.026394 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026125 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-system-cni-dir\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:42.026394 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026149 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-host-run-multus-certs\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:42.026394 ip-10-0-141-145 
kubenswrapper[2576]: I0416 20:11:42.026173 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f2m5g\" (UniqueName: \"kubernetes.io/projected/74366810-9860-49b4-8e6b-049f12b47689-kube-api-access-f2m5g\") pod \"iptables-alerter-sdjtv\" (UID: \"74366810-9860-49b4-8e6b-049f12b47689\") " pod="openshift-network-operator/iptables-alerter-sdjtv" Apr 16 20:11:42.026394 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026181 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-system-cni-dir\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:42.026394 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026201 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-var-lib-openvswitch\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt" Apr 16 20:11:42.026394 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026219 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-host-run-multus-certs\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:42.026394 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026225 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-host-cni-bin\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt" Apr 16 20:11:42.026394 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026342 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8346d3a3-b207-4963-8e85-a3cae84eb2ae-konnectivity-ca\") pod \"konnectivity-agent-7px7k\" (UID: \"8346d3a3-b207-4963-8e85-a3cae84eb2ae\") " pod="kube-system/konnectivity-agent-7px7k" Apr 16 20:11:42.027163 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026375 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b94d0ed6-5b50-4eec-955c-34f48467eb09-etc-tuned\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:42.027163 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026402 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/74b468ec-61b6-4b33-96b9-598cc8545771-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xztj4\" (UID: \"74b468ec-61b6-4b33-96b9-598cc8545771\") " pod="openshift-multus/multus-additional-cni-plugins-xztj4" Apr 16 20:11:42.027163 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026429 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/74b468ec-61b6-4b33-96b9-598cc8545771-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xztj4\" (UID: \"74b468ec-61b6-4b33-96b9-598cc8545771\") " pod="openshift-multus/multus-additional-cni-plugins-xztj4" Apr 16 20:11:42.027163 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026456 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-run-systemd\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt" Apr 16 20:11:42.027163 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026478 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt" Apr 16 20:11:42.027163 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026503 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/13dc8612-6ea4-453f-8d47-52023be79bf8-ovnkube-config\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt" Apr 16 20:11:42.027163 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026547 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/74b468ec-61b6-4b33-96b9-598cc8545771-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xztj4\" (UID: \"74b468ec-61b6-4b33-96b9-598cc8545771\") " pod="openshift-multus/multus-additional-cni-plugins-xztj4" Apr 16 20:11:42.027163 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026562 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b94d0ed6-5b50-4eec-955c-34f48467eb09-tmp\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:42.027163 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026587 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-host-var-lib-cni-multus\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:42.027163 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026613 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twt8t\" (UniqueName: \"kubernetes.io/projected/1c8b43bd-c782-4cdf-bbc1-459c65b0298b-kube-api-access-twt8t\") pod \"aws-ebs-csi-driver-node-cgmkg\" (UID: \"1c8b43bd-c782-4cdf-bbc1-459c65b0298b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgmkg" Apr 16 20:11:42.027163 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026637 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-run-ovn\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt" Apr 16 20:11:42.027163 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026665 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-node-log\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt" Apr 16 20:11:42.027163 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026688 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/13dc8612-6ea4-453f-8d47-52023be79bf8-ovn-node-metrics-cert\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt" Apr 16 20:11:42.027163 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026690 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 20:11:42.027163 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026676 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-host-var-lib-cni-multus\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:42.027163 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026716 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b94d0ed6-5b50-4eec-955c-34f48467eb09-etc-sysctl-d\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:42.027163 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026775 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-multus-socket-dir-parent\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:42.027842 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026812 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/74366810-9860-49b4-8e6b-049f12b47689-iptables-alerter-script\") pod \"iptables-alerter-sdjtv\" (UID: \"74366810-9860-49b4-8e6b-049f12b47689\") " pod="openshift-network-operator/iptables-alerter-sdjtv" Apr 16 20:11:42.027842 
ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026836 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-multus-socket-dir-parent\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:42.027842 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026840 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b94d0ed6-5b50-4eec-955c-34f48467eb09-etc-sysctl-d\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:42.027842 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026845 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-systemd-units\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt" Apr 16 20:11:42.027842 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026916 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-host-run-netns\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt" Apr 16 20:11:42.027842 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.026950 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/299edf22-1b19-43c6-aa15-8124389d617a-dbus\") pod \"global-pull-secret-syncer-gj7ch\" (UID: \"299edf22-1b19-43c6-aa15-8124389d617a\") " 
pod="kube-system/global-pull-secret-syncer-gj7ch" Apr 16 20:11:42.027842 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.027002 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/74b468ec-61b6-4b33-96b9-598cc8545771-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xztj4\" (UID: \"74b468ec-61b6-4b33-96b9-598cc8545771\") " pod="openshift-multus/multus-additional-cni-plugins-xztj4" Apr 16 20:11:42.027842 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.027323 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/74366810-9860-49b4-8e6b-049f12b47689-iptables-alerter-script\") pod \"iptables-alerter-sdjtv\" (UID: \"74366810-9860-49b4-8e6b-049f12b47689\") " pod="openshift-network-operator/iptables-alerter-sdjtv" Apr 16 20:11:42.027842 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.027485 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b94d0ed6-5b50-4eec-955c-34f48467eb09-var-lib-kubelet\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:42.027842 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.027554 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b94d0ed6-5b50-4eec-955c-34f48467eb09-host\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:42.027842 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.027586 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-host-run-netns\") pod 
\"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:42.027842 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.027608 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b94d0ed6-5b50-4eec-955c-34f48467eb09-var-lib-kubelet\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:42.027842 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.027616 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cf44a204-89e9-429e-a562-8e270453e2d3-hosts-file\") pod \"node-resolver-pcr4b\" (UID: \"cf44a204-89e9-429e-a562-8e270453e2d3\") " pod="openshift-dns/node-resolver-pcr4b" Apr 16 20:11:42.027842 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.027644 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8346d3a3-b207-4963-8e85-a3cae84eb2ae-agent-certs\") pod \"konnectivity-agent-7px7k\" (UID: \"8346d3a3-b207-4963-8e85-a3cae84eb2ae\") " pod="kube-system/konnectivity-agent-7px7k" Apr 16 20:11:42.027842 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.027655 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-host-run-netns\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:42.027842 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.027695 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cf44a204-89e9-429e-a562-8e270453e2d3-hosts-file\") pod \"node-resolver-pcr4b\" (UID: 
\"cf44a204-89e9-429e-a562-8e270453e2d3\") " pod="openshift-dns/node-resolver-pcr4b" Apr 16 20:11:42.027842 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.027691 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b94d0ed6-5b50-4eec-955c-34f48467eb09-host\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:42.028582 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.027740 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fd63426d-fc95-432e-ad91-1498d43b0e04-multus-daemon-config\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:42.028582 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.027767 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqt25\" (UniqueName: \"kubernetes.io/projected/fd63426d-fc95-432e-ad91-1498d43b0e04-kube-api-access-mqt25\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:42.028582 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.027799 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-host-slash\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt" Apr 16 20:11:42.028582 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.027822 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b94d0ed6-5b50-4eec-955c-34f48467eb09-etc-kubernetes\") pod \"tuned-8r25h\" (UID: 
\"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:42.028582 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.027846 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/74b468ec-61b6-4b33-96b9-598cc8545771-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xztj4\" (UID: \"74b468ec-61b6-4b33-96b9-598cc8545771\") " pod="openshift-multus/multus-additional-cni-plugins-xztj4" Apr 16 20:11:42.028582 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.027899 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-etc-kubernetes\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:42.028582 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.027925 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/74366810-9860-49b4-8e6b-049f12b47689-host-slash\") pod \"iptables-alerter-sdjtv\" (UID: \"74366810-9860-49b4-8e6b-049f12b47689\") " pod="openshift-network-operator/iptables-alerter-sdjtv" Apr 16 20:11:42.028582 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.027949 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-glzjq\" (UniqueName: \"kubernetes.io/projected/b94d0ed6-5b50-4eec-955c-34f48467eb09-kube-api-access-glzjq\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:42.028582 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.028006 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/74b468ec-61b6-4b33-96b9-598cc8545771-cnibin\") pod \"multus-additional-cni-plugins-xztj4\" (UID: \"74b468ec-61b6-4b33-96b9-598cc8545771\") " pod="openshift-multus/multus-additional-cni-plugins-xztj4" Apr 16 20:11:42.028582 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.028033 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/74b468ec-61b6-4b33-96b9-598cc8545771-cni-binary-copy\") pod \"multus-additional-cni-plugins-xztj4\" (UID: \"74b468ec-61b6-4b33-96b9-598cc8545771\") " pod="openshift-multus/multus-additional-cni-plugins-xztj4" Apr 16 20:11:42.028582 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.028059 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cf44a204-89e9-429e-a562-8e270453e2d3-tmp-dir\") pod \"node-resolver-pcr4b\" (UID: \"cf44a204-89e9-429e-a562-8e270453e2d3\") " pod="openshift-dns/node-resolver-pcr4b" Apr 16 20:11:42.028582 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.028084 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-host-run-ovn-kubernetes\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt" Apr 16 20:11:42.028582 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.028108 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/13dc8612-6ea4-453f-8d47-52023be79bf8-env-overrides\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt" Apr 16 20:11:42.028582 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.028119 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/74366810-9860-49b4-8e6b-049f12b47689-host-slash\") pod \"iptables-alerter-sdjtv\" (UID: \"74366810-9860-49b4-8e6b-049f12b47689\") " pod="openshift-network-operator/iptables-alerter-sdjtv" Apr 16 20:11:42.028582 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.028145 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dht6\" (UniqueName: \"kubernetes.io/projected/13dc8612-6ea4-453f-8d47-52023be79bf8-kube-api-access-5dht6\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt" Apr 16 20:11:42.028582 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.028180 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/299edf22-1b19-43c6-aa15-8124389d617a-kubelet-config\") pod \"global-pull-secret-syncer-gj7ch\" (UID: \"299edf22-1b19-43c6-aa15-8124389d617a\") " pod="kube-system/global-pull-secret-syncer-gj7ch" Apr 16 20:11:42.028582 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.028191 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b94d0ed6-5b50-4eec-955c-34f48467eb09-etc-kubernetes\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:42.029356 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.028207 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctsdd\" (UniqueName: \"kubernetes.io/projected/958053a3-75a0-41f2-8f2b-9790c4c625d8-kube-api-access-ctsdd\") pod \"network-check-target-54bk5\" (UID: \"958053a3-75a0-41f2-8f2b-9790c4c625d8\") " 
pod="openshift-network-diagnostics/network-check-target-54bk5" Apr 16 20:11:42.029356 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.028253 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x8czn\" (UniqueName: \"kubernetes.io/projected/cf44a204-89e9-429e-a562-8e270453e2d3-kube-api-access-x8czn\") pod \"node-resolver-pcr4b\" (UID: \"cf44a204-89e9-429e-a562-8e270453e2d3\") " pod="openshift-dns/node-resolver-pcr4b" Apr 16 20:11:42.029356 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.028271 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fd63426d-fc95-432e-ad91-1498d43b0e04-multus-daemon-config\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:42.029356 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.028278 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1c8b43bd-c782-4cdf-bbc1-459c65b0298b-device-dir\") pod \"aws-ebs-csi-driver-node-cgmkg\" (UID: \"1c8b43bd-c782-4cdf-bbc1-459c65b0298b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgmkg" Apr 16 20:11:42.029356 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.028317 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/13dc8612-6ea4-453f-8d47-52023be79bf8-ovnkube-script-lib\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt" Apr 16 20:11:42.029356 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.028324 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-etc-kubernetes\") 
pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:42.029356 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.028359 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b94d0ed6-5b50-4eec-955c-34f48467eb09-etc-modprobe-d\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:42.029356 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.028393 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b94d0ed6-5b50-4eec-955c-34f48467eb09-etc-sysctl-conf\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:42.029356 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.028420 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b94d0ed6-5b50-4eec-955c-34f48467eb09-run\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:42.029356 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.028446 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6bfc3ff-419c-47de-880a-e6eeafc7247a-metrics-certs\") pod \"network-metrics-daemon-q46p8\" (UID: \"d6bfc3ff-419c-47de-880a-e6eeafc7247a\") " pod="openshift-multus/network-metrics-daemon-q46p8" Apr 16 20:11:42.029356 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.028471 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-cnibin\") pod \"multus-cmxqh\" (UID: 
\"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:42.029356 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.028498 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2e2f3589-a853-4295-aedb-5f577abc797b-host\") pod \"node-ca-j59fj\" (UID: \"2e2f3589-a853-4295-aedb-5f577abc797b\") " pod="openshift-image-registry/node-ca-j59fj" Apr 16 20:11:42.029356 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.028526 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2e2f3589-a853-4295-aedb-5f577abc797b-serviceca\") pod \"node-ca-j59fj\" (UID: \"2e2f3589-a853-4295-aedb-5f577abc797b\") " pod="openshift-image-registry/node-ca-j59fj" Apr 16 20:11:42.029356 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.028553 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-host-kubelet\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt" Apr 16 20:11:42.029356 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.028580 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-host-var-lib-kubelet\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:42.029356 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.028639 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-host-cni-netd\") pod \"ovnkube-node-4t2rt\" (UID: 
\"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt" Apr 16 20:11:42.029356 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.028643 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/74b468ec-61b6-4b33-96b9-598cc8545771-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xztj4\" (UID: \"74b468ec-61b6-4b33-96b9-598cc8545771\") " pod="openshift-multus/multus-additional-cni-plugins-xztj4" Apr 16 20:11:42.030125 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.028663 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b94d0ed6-5b50-4eec-955c-34f48467eb09-sys\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:42.030125 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.028688 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjx5h\" (UniqueName: \"kubernetes.io/projected/d6bfc3ff-419c-47de-880a-e6eeafc7247a-kube-api-access-sjx5h\") pod \"network-metrics-daemon-q46p8\" (UID: \"d6bfc3ff-419c-47de-880a-e6eeafc7247a\") " pod="openshift-multus/network-metrics-daemon-q46p8" Apr 16 20:11:42.030125 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.028736 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b94d0ed6-5b50-4eec-955c-34f48467eb09-run\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:42.030125 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.028891 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-host-run-k8s-cni-cncf-io\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:42.030125 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.028914 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/74b468ec-61b6-4b33-96b9-598cc8545771-cnibin\") pod \"multus-additional-cni-plugins-xztj4\" (UID: \"74b468ec-61b6-4b33-96b9-598cc8545771\") " pod="openshift-multus/multus-additional-cni-plugins-xztj4" Apr 16 20:11:42.030125 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.028950 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2xjj\" (UniqueName: \"kubernetes.io/projected/2e2f3589-a853-4295-aedb-5f577abc797b-kube-api-access-h2xjj\") pod \"node-ca-j59fj\" (UID: \"2e2f3589-a853-4295-aedb-5f577abc797b\") " pod="openshift-image-registry/node-ca-j59fj" Apr 16 20:11:42.030125 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.029005 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c8b43bd-c782-4cdf-bbc1-459c65b0298b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cgmkg\" (UID: \"1c8b43bd-c782-4cdf-bbc1-459c65b0298b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgmkg" Apr 16 20:11:42.030125 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.029045 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1c8b43bd-c782-4cdf-bbc1-459c65b0298b-registration-dir\") pod \"aws-ebs-csi-driver-node-cgmkg\" (UID: \"1c8b43bd-c782-4cdf-bbc1-459c65b0298b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgmkg" Apr 16 20:11:42.030125 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:42.029105 2576 
secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:42.030125 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.029102 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-etc-openvswitch\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt" Apr 16 20:11:42.030125 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.029140 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/299edf22-1b19-43c6-aa15-8124389d617a-original-pull-secret\") pod \"global-pull-secret-syncer-gj7ch\" (UID: \"299edf22-1b19-43c6-aa15-8124389d617a\") " pod="kube-system/global-pull-secret-syncer-gj7ch" Apr 16 20:11:42.030125 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:42.029189 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6bfc3ff-419c-47de-880a-e6eeafc7247a-metrics-certs podName:d6bfc3ff-419c-47de-880a-e6eeafc7247a nodeName:}" failed. No retries permitted until 2026-04-16 20:11:42.529156366 +0000 UTC m=+3.063512092 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d6bfc3ff-419c-47de-880a-e6eeafc7247a-metrics-certs") pod "network-metrics-daemon-q46p8" (UID: "d6bfc3ff-419c-47de-880a-e6eeafc7247a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:42.030125 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.029271 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-host-run-k8s-cni-cncf-io\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:42.030125 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.029389 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/74b468ec-61b6-4b33-96b9-598cc8545771-cni-binary-copy\") pod \"multus-additional-cni-plugins-xztj4\" (UID: \"74b468ec-61b6-4b33-96b9-598cc8545771\") " pod="openshift-multus/multus-additional-cni-plugins-xztj4" Apr 16 20:11:42.030125 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.029399 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-cnibin\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:42.030125 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.029442 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2e2f3589-a853-4295-aedb-5f577abc797b-host\") pod \"node-ca-j59fj\" (UID: \"2e2f3589-a853-4295-aedb-5f577abc797b\") " pod="openshift-image-registry/node-ca-j59fj" Apr 16 20:11:42.030125 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.029500 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b94d0ed6-5b50-4eec-955c-34f48467eb09-etc-modprobe-d\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:42.030937 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.029582 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd63426d-fc95-432e-ad91-1498d43b0e04-host-var-lib-kubelet\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:42.030937 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.029589 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b94d0ed6-5b50-4eec-955c-34f48467eb09-sys\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:42.030937 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.029599 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b94d0ed6-5b50-4eec-955c-34f48467eb09-etc-sysctl-conf\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:42.030937 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.029671 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cf44a204-89e9-429e-a562-8e270453e2d3-tmp-dir\") pod \"node-resolver-pcr4b\" (UID: \"cf44a204-89e9-429e-a562-8e270453e2d3\") " pod="openshift-dns/node-resolver-pcr4b" Apr 16 20:11:42.030937 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.030158 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/b94d0ed6-5b50-4eec-955c-34f48467eb09-etc-tuned\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:42.030937 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.030484 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2e2f3589-a853-4295-aedb-5f577abc797b-serviceca\") pod \"node-ca-j59fj\" (UID: \"2e2f3589-a853-4295-aedb-5f577abc797b\") " pod="openshift-image-registry/node-ca-j59fj" Apr 16 20:11:42.030937 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.030592 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b94d0ed6-5b50-4eec-955c-34f48467eb09-tmp\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:42.037541 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:42.037519 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:11:42.037639 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:42.037545 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:11:42.037639 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:42.037578 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ctsdd for pod openshift-network-diagnostics/network-check-target-54bk5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:42.037766 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:42.037661 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/958053a3-75a0-41f2-8f2b-9790c4c625d8-kube-api-access-ctsdd podName:958053a3-75a0-41f2-8f2b-9790c4c625d8 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:42.537623862 +0000 UTC m=+3.071979593 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-ctsdd" (UniqueName: "kubernetes.io/projected/958053a3-75a0-41f2-8f2b-9790c4c625d8-kube-api-access-ctsdd") pod "network-check-target-54bk5" (UID: "958053a3-75a0-41f2-8f2b-9790c4c625d8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:42.040055 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.039801 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m6lt\" (UniqueName: \"kubernetes.io/projected/74b468ec-61b6-4b33-96b9-598cc8545771-kube-api-access-5m6lt\") pod \"multus-additional-cni-plugins-xztj4\" (UID: \"74b468ec-61b6-4b33-96b9-598cc8545771\") " pod="openshift-multus/multus-additional-cni-plugins-xztj4" Apr 16 20:11:42.040055 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.039849 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8czn\" (UniqueName: \"kubernetes.io/projected/cf44a204-89e9-429e-a562-8e270453e2d3-kube-api-access-x8czn\") pod \"node-resolver-pcr4b\" (UID: \"cf44a204-89e9-429e-a562-8e270453e2d3\") " pod="openshift-dns/node-resolver-pcr4b" Apr 16 20:11:42.040055 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.039962 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2m5g\" (UniqueName: \"kubernetes.io/projected/74366810-9860-49b4-8e6b-049f12b47689-kube-api-access-f2m5g\") pod \"iptables-alerter-sdjtv\" (UID: \"74366810-9860-49b4-8e6b-049f12b47689\") " pod="openshift-network-operator/iptables-alerter-sdjtv" Apr 16 20:11:42.040799 ip-10-0-141-145 kubenswrapper[2576]: I0416 
20:11:42.040774 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqt25\" (UniqueName: \"kubernetes.io/projected/fd63426d-fc95-432e-ad91-1498d43b0e04-kube-api-access-mqt25\") pod \"multus-cmxqh\" (UID: \"fd63426d-fc95-432e-ad91-1498d43b0e04\") " pod="openshift-multus/multus-cmxqh" Apr 16 20:11:42.041191 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.041145 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjx5h\" (UniqueName: \"kubernetes.io/projected/d6bfc3ff-419c-47de-880a-e6eeafc7247a-kube-api-access-sjx5h\") pod \"network-metrics-daemon-q46p8\" (UID: \"d6bfc3ff-419c-47de-880a-e6eeafc7247a\") " pod="openshift-multus/network-metrics-daemon-q46p8" Apr 16 20:11:42.041458 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.041438 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-glzjq\" (UniqueName: \"kubernetes.io/projected/b94d0ed6-5b50-4eec-955c-34f48467eb09-kube-api-access-glzjq\") pod \"tuned-8r25h\" (UID: \"b94d0ed6-5b50-4eec-955c-34f48467eb09\") " pod="openshift-cluster-node-tuning-operator/tuned-8r25h" Apr 16 20:11:42.041510 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.041477 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2xjj\" (UniqueName: \"kubernetes.io/projected/2e2f3589-a853-4295-aedb-5f577abc797b-kube-api-access-h2xjj\") pod \"node-ca-j59fj\" (UID: \"2e2f3589-a853-4295-aedb-5f577abc797b\") " pod="openshift-image-registry/node-ca-j59fj" Apr 16 20:11:42.103674 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.103644 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:11:42.129911 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.129886 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/1c8b43bd-c782-4cdf-bbc1-459c65b0298b-registration-dir\") pod \"aws-ebs-csi-driver-node-cgmkg\" (UID: \"1c8b43bd-c782-4cdf-bbc1-459c65b0298b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgmkg"
Apr 16 20:11:42.130030 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.129924 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-etc-openvswitch\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.130030 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.129950 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/299edf22-1b19-43c6-aa15-8124389d617a-original-pull-secret\") pod \"global-pull-secret-syncer-gj7ch\" (UID: \"299edf22-1b19-43c6-aa15-8124389d617a\") " pod="kube-system/global-pull-secret-syncer-gj7ch"
Apr 16 20:11:42.130030 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.129964 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1c8b43bd-c782-4cdf-bbc1-459c65b0298b-registration-dir\") pod \"aws-ebs-csi-driver-node-cgmkg\" (UID: \"1c8b43bd-c782-4cdf-bbc1-459c65b0298b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgmkg"
Apr 16 20:11:42.130030 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.129991 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1c8b43bd-c782-4cdf-bbc1-459c65b0298b-etc-selinux\") pod \"aws-ebs-csi-driver-node-cgmkg\" (UID: \"1c8b43bd-c782-4cdf-bbc1-459c65b0298b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgmkg"
Apr 16 20:11:42.130181 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130038 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-etc-openvswitch\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.130181 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:42.130083 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 20:11:42.130181 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130089 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1c8b43bd-c782-4cdf-bbc1-459c65b0298b-sys-fs\") pod \"aws-ebs-csi-driver-node-cgmkg\" (UID: \"1c8b43bd-c782-4cdf-bbc1-459c65b0298b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgmkg"
Apr 16 20:11:42.130181 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130130 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1c8b43bd-c782-4cdf-bbc1-459c65b0298b-sys-fs\") pod \"aws-ebs-csi-driver-node-cgmkg\" (UID: \"1c8b43bd-c782-4cdf-bbc1-459c65b0298b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgmkg"
Apr 16 20:11:42.130181 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130089 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1c8b43bd-c782-4cdf-bbc1-459c65b0298b-etc-selinux\") pod \"aws-ebs-csi-driver-node-cgmkg\" (UID: \"1c8b43bd-c782-4cdf-bbc1-459c65b0298b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgmkg"
Apr 16 20:11:42.130181 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:42.130163 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/299edf22-1b19-43c6-aa15-8124389d617a-original-pull-secret podName:299edf22-1b19-43c6-aa15-8124389d617a nodeName:}" failed. No retries permitted until 2026-04-16 20:11:42.630144644 +0000 UTC m=+3.164500363 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/299edf22-1b19-43c6-aa15-8124389d617a-original-pull-secret") pod "global-pull-secret-syncer-gj7ch" (UID: "299edf22-1b19-43c6-aa15-8124389d617a") : object "kube-system"/"original-pull-secret" not registered
Apr 16 20:11:42.130413 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130186 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-run-openvswitch\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.130413 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130222 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1c8b43bd-c782-4cdf-bbc1-459c65b0298b-socket-dir\") pod \"aws-ebs-csi-driver-node-cgmkg\" (UID: \"1c8b43bd-c782-4cdf-bbc1-459c65b0298b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgmkg"
Apr 16 20:11:42.130413 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130247 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-log-socket\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.130413 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130269 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-run-openvswitch\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.130413 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130280 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-var-lib-openvswitch\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.130413 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130304 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-host-cni-bin\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.130413 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130316 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-log-socket\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.130413 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130329 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8346d3a3-b207-4963-8e85-a3cae84eb2ae-konnectivity-ca\") pod \"konnectivity-agent-7px7k\" (UID: \"8346d3a3-b207-4963-8e85-a3cae84eb2ae\") " pod="kube-system/konnectivity-agent-7px7k"
Apr 16 20:11:42.130413 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130345 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1c8b43bd-c782-4cdf-bbc1-459c65b0298b-socket-dir\") pod \"aws-ebs-csi-driver-node-cgmkg\" (UID: \"1c8b43bd-c782-4cdf-bbc1-459c65b0298b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgmkg"
Apr 16 20:11:42.130413 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130358 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-run-systemd\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.130413 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130366 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-var-lib-openvswitch\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.130413 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130385 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.130413 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130389 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-run-systemd\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.130413 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130392 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-host-cni-bin\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.131019 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130420 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/13dc8612-6ea4-453f-8d47-52023be79bf8-ovnkube-config\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.131019 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130437 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.131019 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130455 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twt8t\" (UniqueName: \"kubernetes.io/projected/1c8b43bd-c782-4cdf-bbc1-459c65b0298b-kube-api-access-twt8t\") pod \"aws-ebs-csi-driver-node-cgmkg\" (UID: \"1c8b43bd-c782-4cdf-bbc1-459c65b0298b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgmkg"
Apr 16 20:11:42.131019 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130483 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-run-ovn\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.131019 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130508 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-node-log\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.131019 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130533 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/13dc8612-6ea4-453f-8d47-52023be79bf8-ovn-node-metrics-cert\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.131019 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130562 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-systemd-units\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.131019 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130562 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-run-ovn\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.131019 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130601 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-node-log\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.131019 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130635 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-host-run-netns\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.131019 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130662 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/299edf22-1b19-43c6-aa15-8124389d617a-dbus\") pod \"global-pull-secret-syncer-gj7ch\" (UID: \"299edf22-1b19-43c6-aa15-8124389d617a\") " pod="kube-system/global-pull-secret-syncer-gj7ch"
Apr 16 20:11:42.131019 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130681 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-systemd-units\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.131019 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130690 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8346d3a3-b207-4963-8e85-a3cae84eb2ae-agent-certs\") pod \"konnectivity-agent-7px7k\" (UID: \"8346d3a3-b207-4963-8e85-a3cae84eb2ae\") " pod="kube-system/konnectivity-agent-7px7k"
Apr 16 20:11:42.131019 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130723 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-host-run-netns\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.131019 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130717 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-host-slash\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.131019 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130767 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-host-run-ovn-kubernetes\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.131019 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130790 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/13dc8612-6ea4-453f-8d47-52023be79bf8-env-overrides\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.131693 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130814 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dht6\" (UniqueName: \"kubernetes.io/projected/13dc8612-6ea4-453f-8d47-52023be79bf8-kube-api-access-5dht6\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.131693 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130831 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8346d3a3-b207-4963-8e85-a3cae84eb2ae-konnectivity-ca\") pod \"konnectivity-agent-7px7k\" (UID: \"8346d3a3-b207-4963-8e85-a3cae84eb2ae\") " pod="kube-system/konnectivity-agent-7px7k"
Apr 16 20:11:42.131693 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130838 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/299edf22-1b19-43c6-aa15-8124389d617a-kubelet-config\") pod \"global-pull-secret-syncer-gj7ch\" (UID: \"299edf22-1b19-43c6-aa15-8124389d617a\") " pod="kube-system/global-pull-secret-syncer-gj7ch"
Apr 16 20:11:42.131693 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130841 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/299edf22-1b19-43c6-aa15-8124389d617a-dbus\") pod \"global-pull-secret-syncer-gj7ch\" (UID: \"299edf22-1b19-43c6-aa15-8124389d617a\") " pod="kube-system/global-pull-secret-syncer-gj7ch"
Apr 16 20:11:42.131693 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130876 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1c8b43bd-c782-4cdf-bbc1-459c65b0298b-device-dir\") pod \"aws-ebs-csi-driver-node-cgmkg\" (UID: \"1c8b43bd-c782-4cdf-bbc1-459c65b0298b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgmkg"
Apr 16 20:11:42.131693 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130879 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-host-run-ovn-kubernetes\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.131693 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130901 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/13dc8612-6ea4-453f-8d47-52023be79bf8-ovnkube-script-lib\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.131693 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130945 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-host-kubelet\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.131693 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130959 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-host-slash\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.131693 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.130969 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-host-cni-netd\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.131693 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.131030 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-host-cni-netd\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.131693 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.131038 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1c8b43bd-c782-4cdf-bbc1-459c65b0298b-device-dir\") pod \"aws-ebs-csi-driver-node-cgmkg\" (UID: \"1c8b43bd-c782-4cdf-bbc1-459c65b0298b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgmkg"
Apr 16 20:11:42.131693 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.131076 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/13dc8612-6ea4-453f-8d47-52023be79bf8-host-kubelet\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.131693 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.131125 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/299edf22-1b19-43c6-aa15-8124389d617a-kubelet-config\") pod \"global-pull-secret-syncer-gj7ch\" (UID: \"299edf22-1b19-43c6-aa15-8124389d617a\") " pod="kube-system/global-pull-secret-syncer-gj7ch"
Apr 16 20:11:42.131693 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.131233 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/13dc8612-6ea4-453f-8d47-52023be79bf8-env-overrides\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.131693 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.131279 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c8b43bd-c782-4cdf-bbc1-459c65b0298b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cgmkg\" (UID: \"1c8b43bd-c782-4cdf-bbc1-459c65b0298b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgmkg"
Apr 16 20:11:42.131693 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.131346 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c8b43bd-c782-4cdf-bbc1-459c65b0298b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cgmkg\" (UID: \"1c8b43bd-c782-4cdf-bbc1-459c65b0298b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgmkg"
Apr 16 20:11:42.132250 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.131424 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/13dc8612-6ea4-453f-8d47-52023be79bf8-ovnkube-config\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.132250 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.131523 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/13dc8612-6ea4-453f-8d47-52023be79bf8-ovnkube-script-lib\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.133201 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.133167 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/13dc8612-6ea4-453f-8d47-52023be79bf8-ovn-node-metrics-cert\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.133743 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.133720 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8346d3a3-b207-4963-8e85-a3cae84eb2ae-agent-certs\") pod \"konnectivity-agent-7px7k\" (UID: \"8346d3a3-b207-4963-8e85-a3cae84eb2ae\") " pod="kube-system/konnectivity-agent-7px7k"
Apr 16 20:11:42.138384 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.138364 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-twt8t\" (UniqueName: \"kubernetes.io/projected/1c8b43bd-c782-4cdf-bbc1-459c65b0298b-kube-api-access-twt8t\") pod \"aws-ebs-csi-driver-node-cgmkg\" (UID: \"1c8b43bd-c782-4cdf-bbc1-459c65b0298b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgmkg"
Apr 16 20:11:42.139246 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.139225 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dht6\" (UniqueName: \"kubernetes.io/projected/13dc8612-6ea4-453f-8d47-52023be79bf8-kube-api-access-5dht6\") pod \"ovnkube-node-4t2rt\" (UID: \"13dc8612-6ea4-453f-8d47-52023be79bf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.217165 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.217108 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8r25h"
Apr 16 20:11:42.225787 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.225764 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xztj4"
Apr 16 20:11:42.235466 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.235450 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-cmxqh"
Apr 16 20:11:42.238964 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.238948 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-j59fj"
Apr 16 20:11:42.246502 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.246488 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-sdjtv"
Apr 16 20:11:42.252949 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.252934 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-pcr4b"
Apr 16 20:11:42.259504 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.259486 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-7px7k"
Apr 16 20:11:42.266018 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.265993 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgmkg"
Apr 16 20:11:42.271554 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.271538 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:11:42.533846 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.533767 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6bfc3ff-419c-47de-880a-e6eeafc7247a-metrics-certs\") pod \"network-metrics-daemon-q46p8\" (UID: \"d6bfc3ff-419c-47de-880a-e6eeafc7247a\") " pod="openshift-multus/network-metrics-daemon-q46p8"
Apr 16 20:11:42.534001 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:42.533891 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:11:42.534001 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:42.533957 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6bfc3ff-419c-47de-880a-e6eeafc7247a-metrics-certs podName:d6bfc3ff-419c-47de-880a-e6eeafc7247a nodeName:}" failed. No retries permitted until 2026-04-16 20:11:43.533939376 +0000 UTC m=+4.068295102 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d6bfc3ff-419c-47de-880a-e6eeafc7247a-metrics-certs") pod "network-metrics-daemon-q46p8" (UID: "d6bfc3ff-419c-47de-880a-e6eeafc7247a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:11:42.609917 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:42.609895 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13dc8612_6ea4_453f_8d47_52023be79bf8.slice/crio-8bd1d1b2d2e185db6e953ca785c497c542d0ca5409787d0628d3dfb85fb37ce1 WatchSource:0}: Error finding container 8bd1d1b2d2e185db6e953ca785c497c542d0ca5409787d0628d3dfb85fb37ce1: Status 404 returned error can't find the container with id 8bd1d1b2d2e185db6e953ca785c497c542d0ca5409787d0628d3dfb85fb37ce1
Apr 16 20:11:42.611850 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:42.611816 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf44a204_89e9_429e_a562_8e270453e2d3.slice/crio-b415a152eb855f5cb76b6b1c97050bcb28ca4d0f97d4dbb57d7ad7fff522a3bb WatchSource:0}: Error finding container b415a152eb855f5cb76b6b1c97050bcb28ca4d0f97d4dbb57d7ad7fff522a3bb: Status 404 returned error can't find the container with id b415a152eb855f5cb76b6b1c97050bcb28ca4d0f97d4dbb57d7ad7fff522a3bb
Apr 16 20:11:42.613316 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:42.613292 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd63426d_fc95_432e_ad91_1498d43b0e04.slice/crio-3721f1440e3e1f7789efcfc0d9951246b92c3dead0cd6b8df4c10db3fe7092cf WatchSource:0}: Error finding container 3721f1440e3e1f7789efcfc0d9951246b92c3dead0cd6b8df4c10db3fe7092cf: Status 404 returned error can't find the container with id 3721f1440e3e1f7789efcfc0d9951246b92c3dead0cd6b8df4c10db3fe7092cf
Apr 16 20:11:42.615548 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:42.615528 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74b468ec_61b6_4b33_96b9_598cc8545771.slice/crio-f8d8a714589da2799fba65fef6b818a0dd2d085095a6fc669578955145a76536 WatchSource:0}: Error finding container f8d8a714589da2799fba65fef6b818a0dd2d085095a6fc669578955145a76536: Status 404 returned error can't find the container with id f8d8a714589da2799fba65fef6b818a0dd2d085095a6fc669578955145a76536
Apr 16 20:11:42.616402 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:42.616384 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e2f3589_a853_4295_aedb_5f577abc797b.slice/crio-7320cc97036e378816048c9b79f76479b478341a45273436b23ebaccf907207c WatchSource:0}: Error finding container 7320cc97036e378816048c9b79f76479b478341a45273436b23ebaccf907207c: Status 404 returned error can't find the container with id 7320cc97036e378816048c9b79f76479b478341a45273436b23ebaccf907207c
Apr 16 20:11:42.618053 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:42.618007 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c8b43bd_c782_4cdf_bbc1_459c65b0298b.slice/crio-e4a11255b512c67683c574b50a8fc090425971138d90808e3803bd84d6d51b4e WatchSource:0}: Error finding container e4a11255b512c67683c574b50a8fc090425971138d90808e3803bd84d6d51b4e: Status 404 returned error can't find the container with id e4a11255b512c67683c574b50a8fc090425971138d90808e3803bd84d6d51b4e
Apr 16 20:11:42.618374 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:42.618267 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74366810_9860_49b4_8e6b_049f12b47689.slice/crio-9f2939319ecbe7821cfcfd763acc8198033fe84cfad477763dd9abb3f5c4eb3d WatchSource:0}: Error finding container 9f2939319ecbe7821cfcfd763acc8198033fe84cfad477763dd9abb3f5c4eb3d: Status 404 returned error can't find the container with id 9f2939319ecbe7821cfcfd763acc8198033fe84cfad477763dd9abb3f5c4eb3d
Apr 16 20:11:42.619649 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:42.619592 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8346d3a3_b207_4963_8e85_a3cae84eb2ae.slice/crio-bece86f0aa6f607b9366f31bcf4427519263295075dcebd562697711aa4af6bb WatchSource:0}: Error finding container bece86f0aa6f607b9366f31bcf4427519263295075dcebd562697711aa4af6bb: Status 404 returned error can't find the container with id bece86f0aa6f607b9366f31bcf4427519263295075dcebd562697711aa4af6bb
Apr 16 20:11:42.620096 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:11:42.620075 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb94d0ed6_5b50_4eec_955c_34f48467eb09.slice/crio-1ab78e90463697d9641079a7a68bb00e7c9baa93e9e0817e4c1b11f6470a1f8d WatchSource:0}: Error finding container 1ab78e90463697d9641079a7a68bb00e7c9baa93e9e0817e4c1b11f6470a1f8d: Status 404 returned error can't find the container with id 1ab78e90463697d9641079a7a68bb00e7c9baa93e9e0817e4c1b11f6470a1f8d
Apr 16 20:11:42.634411 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.634275 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctsdd\" (UniqueName: \"kubernetes.io/projected/958053a3-75a0-41f2-8f2b-9790c4c625d8-kube-api-access-ctsdd\") pod \"network-check-target-54bk5\" (UID: \"958053a3-75a0-41f2-8f2b-9790c4c625d8\") " pod="openshift-network-diagnostics/network-check-target-54bk5"
Apr 16 20:11:42.634411 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:42.634351 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/299edf22-1b19-43c6-aa15-8124389d617a-original-pull-secret\") pod \"global-pull-secret-syncer-gj7ch\" (UID: \"299edf22-1b19-43c6-aa15-8124389d617a\") " pod="kube-system/global-pull-secret-syncer-gj7ch"
Apr 16 20:11:42.634564 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:42.634534 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 20:11:42.634564 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:42.634556 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 20:11:42.634660 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:42.634570 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ctsdd for pod openshift-network-diagnostics/network-check-target-54bk5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:11:42.634660 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:42.634577 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 20:11:42.634660 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:42.634633 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/958053a3-75a0-41f2-8f2b-9790c4c625d8-kube-api-access-ctsdd podName:958053a3-75a0-41f2-8f2b-9790c4c625d8 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:43.634614229 +0000 UTC m=+4.168970040 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-ctsdd" (UniqueName: "kubernetes.io/projected/958053a3-75a0-41f2-8f2b-9790c4c625d8-kube-api-access-ctsdd") pod "network-check-target-54bk5" (UID: "958053a3-75a0-41f2-8f2b-9790c4c625d8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:11:42.634660 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:42.634654 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/299edf22-1b19-43c6-aa15-8124389d617a-original-pull-secret podName:299edf22-1b19-43c6-aa15-8124389d617a nodeName:}" failed. No retries permitted until 2026-04-16 20:11:43.634645398 +0000 UTC m=+4.169001105 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/299edf22-1b19-43c6-aa15-8124389d617a-original-pull-secret") pod "global-pull-secret-syncer-gj7ch" (UID: "299edf22-1b19-43c6-aa15-8124389d617a") : object "kube-system"/"original-pull-secret" not registered
Apr 16 20:11:43.012031 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:43.011768 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 20:06:40 +0000 UTC" deadline="2028-01-01 08:55:50.50253744 +0000 UTC"
Apr 16 20:11:43.012031 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:43.011948 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14988h44m7.490593565s"
Apr 16 20:11:43.060773 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:43.060740 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xztj4" event={"ID":"74b468ec-61b6-4b33-96b9-598cc8545771","Type":"ContainerStarted","Data":"f8d8a714589da2799fba65fef6b818a0dd2d085095a6fc669578955145a76536"}
Apr 16 20:11:43.063114
ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:43.063085 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-145.ec2.internal" event={"ID":"ea24cdd89c298d8269b8d7acdbb62d3a","Type":"ContainerStarted","Data":"ea0cb4fbd494312355df02cc8df07e300a0a38ebc9252d6a1897d0dd68c8eadb"} Apr 16 20:11:43.065156 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:43.064542 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7px7k" event={"ID":"8346d3a3-b207-4963-8e85-a3cae84eb2ae","Type":"ContainerStarted","Data":"bece86f0aa6f607b9366f31bcf4427519263295075dcebd562697711aa4af6bb"} Apr 16 20:11:43.066454 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:43.066422 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-j59fj" event={"ID":"2e2f3589-a853-4295-aedb-5f577abc797b","Type":"ContainerStarted","Data":"7320cc97036e378816048c9b79f76479b478341a45273436b23ebaccf907207c"} Apr 16 20:11:43.070077 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:43.070026 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cmxqh" event={"ID":"fd63426d-fc95-432e-ad91-1498d43b0e04","Type":"ContainerStarted","Data":"3721f1440e3e1f7789efcfc0d9951246b92c3dead0cd6b8df4c10db3fe7092cf"} Apr 16 20:11:43.074297 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:43.074273 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pcr4b" event={"ID":"cf44a204-89e9-429e-a562-8e270453e2d3","Type":"ContainerStarted","Data":"b415a152eb855f5cb76b6b1c97050bcb28ca4d0f97d4dbb57d7ad7fff522a3bb"} Apr 16 20:11:43.078173 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:43.078149 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt" event={"ID":"13dc8612-6ea4-453f-8d47-52023be79bf8","Type":"ContainerStarted","Data":"8bd1d1b2d2e185db6e953ca785c497c542d0ca5409787d0628d3dfb85fb37ce1"} Apr 16 
20:11:43.081151 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:43.081128 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8r25h" event={"ID":"b94d0ed6-5b50-4eec-955c-34f48467eb09","Type":"ContainerStarted","Data":"1ab78e90463697d9641079a7a68bb00e7c9baa93e9e0817e4c1b11f6470a1f8d"} Apr 16 20:11:43.086472 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:43.086297 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-sdjtv" event={"ID":"74366810-9860-49b4-8e6b-049f12b47689","Type":"ContainerStarted","Data":"9f2939319ecbe7821cfcfd763acc8198033fe84cfad477763dd9abb3f5c4eb3d"} Apr 16 20:11:43.087770 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:43.087719 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgmkg" event={"ID":"1c8b43bd-c782-4cdf-bbc1-459c65b0298b","Type":"ContainerStarted","Data":"e4a11255b512c67683c574b50a8fc090425971138d90808e3803bd84d6d51b4e"} Apr 16 20:11:43.541807 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:43.541736 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6bfc3ff-419c-47de-880a-e6eeafc7247a-metrics-certs\") pod \"network-metrics-daemon-q46p8\" (UID: \"d6bfc3ff-419c-47de-880a-e6eeafc7247a\") " pod="openshift-multus/network-metrics-daemon-q46p8" Apr 16 20:11:43.541935 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:43.541877 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:43.541935 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:43.541933 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6bfc3ff-419c-47de-880a-e6eeafc7247a-metrics-certs podName:d6bfc3ff-419c-47de-880a-e6eeafc7247a nodeName:}" failed. 
No retries permitted until 2026-04-16 20:11:45.541914495 +0000 UTC m=+6.076270207 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d6bfc3ff-419c-47de-880a-e6eeafc7247a-metrics-certs") pod "network-metrics-daemon-q46p8" (UID: "d6bfc3ff-419c-47de-880a-e6eeafc7247a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:43.642935 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:43.642902 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/299edf22-1b19-43c6-aa15-8124389d617a-original-pull-secret\") pod \"global-pull-secret-syncer-gj7ch\" (UID: \"299edf22-1b19-43c6-aa15-8124389d617a\") " pod="kube-system/global-pull-secret-syncer-gj7ch" Apr 16 20:11:43.643100 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:43.642992 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctsdd\" (UniqueName: \"kubernetes.io/projected/958053a3-75a0-41f2-8f2b-9790c4c625d8-kube-api-access-ctsdd\") pod \"network-check-target-54bk5\" (UID: \"958053a3-75a0-41f2-8f2b-9790c4c625d8\") " pod="openshift-network-diagnostics/network-check-target-54bk5" Apr 16 20:11:43.643170 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:43.643125 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:11:43.643170 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:43.643141 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:11:43.643170 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:43.643153 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ctsdd for pod 
openshift-network-diagnostics/network-check-target-54bk5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:43.643316 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:43.643201 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/958053a3-75a0-41f2-8f2b-9790c4c625d8-kube-api-access-ctsdd podName:958053a3-75a0-41f2-8f2b-9790c4c625d8 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:45.643185028 +0000 UTC m=+6.177540734 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-ctsdd" (UniqueName: "kubernetes.io/projected/958053a3-75a0-41f2-8f2b-9790c4c625d8-kube-api-access-ctsdd") pod "network-check-target-54bk5" (UID: "958053a3-75a0-41f2-8f2b-9790c4c625d8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:43.643597 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:43.643580 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:11:43.643703 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:43.643628 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/299edf22-1b19-43c6-aa15-8124389d617a-original-pull-secret podName:299edf22-1b19-43c6-aa15-8124389d617a nodeName:}" failed. No retries permitted until 2026-04-16 20:11:45.643614112 +0000 UTC m=+6.177969820 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/299edf22-1b19-43c6-aa15-8124389d617a-original-pull-secret") pod "global-pull-secret-syncer-gj7ch" (UID: "299edf22-1b19-43c6-aa15-8124389d617a") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:11:44.055789 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:44.055131 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gj7ch" Apr 16 20:11:44.055789 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:44.055249 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gj7ch" podUID="299edf22-1b19-43c6-aa15-8124389d617a" Apr 16 20:11:44.055789 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:44.055661 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-54bk5" Apr 16 20:11:44.055789 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:44.055748 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-54bk5" podUID="958053a3-75a0-41f2-8f2b-9790c4c625d8" Apr 16 20:11:44.058319 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:44.057181 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q46p8" Apr 16 20:11:44.058319 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:44.057291 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q46p8" podUID="d6bfc3ff-419c-47de-880a-e6eeafc7247a" Apr 16 20:11:44.105088 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:44.104607 2576 generic.go:358] "Generic (PLEG): container finished" podID="215ca7d29ca438683d398763be889a08" containerID="45d8c9fb624560aa3bdd15ad333f83164ec694c1d8d67665865b3c2c25720baf" exitCode=0 Apr 16 20:11:44.105088 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:44.104713 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-145.ec2.internal" event={"ID":"215ca7d29ca438683d398763be889a08","Type":"ContainerDied","Data":"45d8c9fb624560aa3bdd15ad333f83164ec694c1d8d67665865b3c2c25720baf"} Apr 16 20:11:44.125768 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:44.124650 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-145.ec2.internal" podStartSLOduration=3.124634596 podStartE2EDuration="3.124634596s" podCreationTimestamp="2026-04-16 20:11:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:11:43.078280291 +0000 UTC m=+3.612636018" watchObservedRunningTime="2026-04-16 20:11:44.124634596 +0000 UTC m=+4.658990323" Apr 16 20:11:45.115737 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:45.115088 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-145.ec2.internal" event={"ID":"215ca7d29ca438683d398763be889a08","Type":"ContainerStarted","Data":"56f443112c53cf967e90bd346caa3e46a40ac56e46f20519d84760249bab9a4a"} Apr 16 20:11:45.130789 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:45.130309 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-145.ec2.internal" podStartSLOduration=4.130291614 podStartE2EDuration="4.130291614s" podCreationTimestamp="2026-04-16 20:11:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:11:45.129500912 +0000 UTC m=+5.663856642" watchObservedRunningTime="2026-04-16 20:11:45.130291614 +0000 UTC m=+5.664647342" Apr 16 20:11:45.562923 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:45.562836 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6bfc3ff-419c-47de-880a-e6eeafc7247a-metrics-certs\") pod \"network-metrics-daemon-q46p8\" (UID: \"d6bfc3ff-419c-47de-880a-e6eeafc7247a\") " pod="openshift-multus/network-metrics-daemon-q46p8" Apr 16 20:11:45.563103 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:45.563023 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:45.563103 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:45.563088 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6bfc3ff-419c-47de-880a-e6eeafc7247a-metrics-certs podName:d6bfc3ff-419c-47de-880a-e6eeafc7247a nodeName:}" failed. No retries permitted until 2026-04-16 20:11:49.563069957 +0000 UTC m=+10.097425663 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d6bfc3ff-419c-47de-880a-e6eeafc7247a-metrics-certs") pod "network-metrics-daemon-q46p8" (UID: "d6bfc3ff-419c-47de-880a-e6eeafc7247a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:45.664869 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:45.664147 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctsdd\" (UniqueName: \"kubernetes.io/projected/958053a3-75a0-41f2-8f2b-9790c4c625d8-kube-api-access-ctsdd\") pod \"network-check-target-54bk5\" (UID: \"958053a3-75a0-41f2-8f2b-9790c4c625d8\") " pod="openshift-network-diagnostics/network-check-target-54bk5" Apr 16 20:11:45.664869 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:45.664218 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/299edf22-1b19-43c6-aa15-8124389d617a-original-pull-secret\") pod \"global-pull-secret-syncer-gj7ch\" (UID: \"299edf22-1b19-43c6-aa15-8124389d617a\") " pod="kube-system/global-pull-secret-syncer-gj7ch" Apr 16 20:11:45.664869 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:45.664355 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:11:45.664869 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:45.664409 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/299edf22-1b19-43c6-aa15-8124389d617a-original-pull-secret podName:299edf22-1b19-43c6-aa15-8124389d617a nodeName:}" failed. No retries permitted until 2026-04-16 20:11:49.66439196 +0000 UTC m=+10.198747670 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/299edf22-1b19-43c6-aa15-8124389d617a-original-pull-secret") pod "global-pull-secret-syncer-gj7ch" (UID: "299edf22-1b19-43c6-aa15-8124389d617a") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:11:45.665457 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:45.665425 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:11:45.665542 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:45.665455 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:11:45.665542 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:45.665492 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ctsdd for pod openshift-network-diagnostics/network-check-target-54bk5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:45.665626 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:45.665567 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/958053a3-75a0-41f2-8f2b-9790c4c625d8-kube-api-access-ctsdd podName:958053a3-75a0-41f2-8f2b-9790c4c625d8 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:49.66554234 +0000 UTC m=+10.199898049 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ctsdd" (UniqueName: "kubernetes.io/projected/958053a3-75a0-41f2-8f2b-9790c4c625d8-kube-api-access-ctsdd") pod "network-check-target-54bk5" (UID: "958053a3-75a0-41f2-8f2b-9790c4c625d8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:46.057824 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:46.057754 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gj7ch" Apr 16 20:11:46.057989 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:46.057843 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gj7ch" podUID="299edf22-1b19-43c6-aa15-8124389d617a" Apr 16 20:11:46.058248 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:46.058228 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-54bk5" Apr 16 20:11:46.058334 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:46.058315 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-54bk5" podUID="958053a3-75a0-41f2-8f2b-9790c4c625d8" Apr 16 20:11:46.058456 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:46.058440 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q46p8" Apr 16 20:11:46.058558 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:46.058527 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q46p8" podUID="d6bfc3ff-419c-47de-880a-e6eeafc7247a" Apr 16 20:11:48.057751 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:48.057724 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q46p8" Apr 16 20:11:48.058192 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:48.057724 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-54bk5" Apr 16 20:11:48.058192 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:48.057857 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q46p8" podUID="d6bfc3ff-419c-47de-880a-e6eeafc7247a" Apr 16 20:11:48.058192 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:48.057724 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-gj7ch" Apr 16 20:11:48.058192 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:48.057922 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-54bk5" podUID="958053a3-75a0-41f2-8f2b-9790c4c625d8" Apr 16 20:11:48.058192 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:48.058024 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gj7ch" podUID="299edf22-1b19-43c6-aa15-8124389d617a" Apr 16 20:11:49.594539 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:49.594334 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6bfc3ff-419c-47de-880a-e6eeafc7247a-metrics-certs\") pod \"network-metrics-daemon-q46p8\" (UID: \"d6bfc3ff-419c-47de-880a-e6eeafc7247a\") " pod="openshift-multus/network-metrics-daemon-q46p8" Apr 16 20:11:49.594539 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:49.594516 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:49.595018 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:49.594579 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6bfc3ff-419c-47de-880a-e6eeafc7247a-metrics-certs podName:d6bfc3ff-419c-47de-880a-e6eeafc7247a nodeName:}" failed. 
No retries permitted until 2026-04-16 20:11:57.594563133 +0000 UTC m=+18.128918838 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d6bfc3ff-419c-47de-880a-e6eeafc7247a-metrics-certs") pod "network-metrics-daemon-q46p8" (UID: "d6bfc3ff-419c-47de-880a-e6eeafc7247a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:49.695281 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:49.695204 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctsdd\" (UniqueName: \"kubernetes.io/projected/958053a3-75a0-41f2-8f2b-9790c4c625d8-kube-api-access-ctsdd\") pod \"network-check-target-54bk5\" (UID: \"958053a3-75a0-41f2-8f2b-9790c4c625d8\") " pod="openshift-network-diagnostics/network-check-target-54bk5" Apr 16 20:11:49.695281 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:49.695268 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/299edf22-1b19-43c6-aa15-8124389d617a-original-pull-secret\") pod \"global-pull-secret-syncer-gj7ch\" (UID: \"299edf22-1b19-43c6-aa15-8124389d617a\") " pod="kube-system/global-pull-secret-syncer-gj7ch" Apr 16 20:11:49.695462 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:49.695390 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:11:49.695462 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:49.695407 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:11:49.695462 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:49.695411 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 
16 20:11:49.695462 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:49.695427 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ctsdd for pod openshift-network-diagnostics/network-check-target-54bk5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:49.695651 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:49.695466 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/299edf22-1b19-43c6-aa15-8124389d617a-original-pull-secret podName:299edf22-1b19-43c6-aa15-8124389d617a nodeName:}" failed. No retries permitted until 2026-04-16 20:11:57.695449054 +0000 UTC m=+18.229804761 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/299edf22-1b19-43c6-aa15-8124389d617a-original-pull-secret") pod "global-pull-secret-syncer-gj7ch" (UID: "299edf22-1b19-43c6-aa15-8124389d617a") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:11:49.695651 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:49.695485 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/958053a3-75a0-41f2-8f2b-9790c4c625d8-kube-api-access-ctsdd podName:958053a3-75a0-41f2-8f2b-9790c4c625d8 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:57.695474608 +0000 UTC m=+18.229830313 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ctsdd" (UniqueName: "kubernetes.io/projected/958053a3-75a0-41f2-8f2b-9790c4c625d8-kube-api-access-ctsdd") pod "network-check-target-54bk5" (UID: "958053a3-75a0-41f2-8f2b-9790c4c625d8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:11:50.056636 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:50.055908 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gj7ch"
Apr 16 20:11:50.056636 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:50.056029 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gj7ch" podUID="299edf22-1b19-43c6-aa15-8124389d617a"
Apr 16 20:11:50.056636 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:50.056357 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-54bk5"
Apr 16 20:11:50.056636 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:50.056443 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-54bk5" podUID="958053a3-75a0-41f2-8f2b-9790c4c625d8"
Apr 16 20:11:50.056636 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:50.056486 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q46p8"
Apr 16 20:11:50.056636 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:50.056562 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q46p8" podUID="d6bfc3ff-419c-47de-880a-e6eeafc7247a"
Apr 16 20:11:52.054749 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:52.054519 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gj7ch"
Apr 16 20:11:52.055211 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:52.054528 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-54bk5"
Apr 16 20:11:52.055211 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:52.054904 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gj7ch" podUID="299edf22-1b19-43c6-aa15-8124389d617a"
Apr 16 20:11:52.055211 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:52.054961 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-54bk5" podUID="958053a3-75a0-41f2-8f2b-9790c4c625d8"
Apr 16 20:11:52.055211 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:52.054559 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q46p8"
Apr 16 20:11:52.055211 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:52.055077 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q46p8" podUID="d6bfc3ff-419c-47de-880a-e6eeafc7247a"
Apr 16 20:11:52.128789 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:52.128756 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8r25h" event={"ID":"b94d0ed6-5b50-4eec-955c-34f48467eb09","Type":"ContainerStarted","Data":"d7776668f0a8df8172c93dc7733dd5a11be78b4c33f71cf4bd50a9059bd184f0"}
Apr 16 20:11:52.130356 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:52.130326 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgmkg" event={"ID":"1c8b43bd-c782-4cdf-bbc1-459c65b0298b","Type":"ContainerStarted","Data":"d7ffdafd5fc5615d40ec0b56cbb9e2c009230c90d41484398ca32839ca9fe0ea"}
Apr 16 20:11:52.131850 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:52.131813 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xztj4" event={"ID":"74b468ec-61b6-4b33-96b9-598cc8545771","Type":"ContainerStarted","Data":"bdbedcaf2dd7c5e7c7464fc6811421c9168f11626f273fed41ce08b97c8e212c"}
Apr 16 20:11:52.133260 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:52.133234 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7px7k" event={"ID":"8346d3a3-b207-4963-8e85-a3cae84eb2ae","Type":"ContainerStarted","Data":"5bf8a61277d3d1b14a02d6776e25839eb809cc96a70f142bcc8ab67f3850f1eb"}
Apr 16 20:11:52.135305 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:52.134944 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-j59fj" event={"ID":"2e2f3589-a853-4295-aedb-5f577abc797b","Type":"ContainerStarted","Data":"f809ee2b448313a639fab33068ac2208b7ff4afa97b6c6b7779b8a2809333e85"}
Apr 16 20:11:52.136572 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:52.136503 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pcr4b" event={"ID":"cf44a204-89e9-429e-a562-8e270453e2d3","Type":"ContainerStarted","Data":"fe30958ad986ac7f24e4d5ea5df6ad10a5f5fdb95ba52e545ed3d49b52d3326f"}
Apr 16 20:11:52.142536 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:52.142496 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-8r25h" podStartSLOduration=3.6273717420000002 podStartE2EDuration="12.142485246s" podCreationTimestamp="2026-04-16 20:11:40 +0000 UTC" firstStartedPulling="2026-04-16 20:11:42.622138387 +0000 UTC m=+3.156494096" lastFinishedPulling="2026-04-16 20:11:51.137251892 +0000 UTC m=+11.671607600" observedRunningTime="2026-04-16 20:11:52.142156764 +0000 UTC m=+12.676512491" watchObservedRunningTime="2026-04-16 20:11:52.142485246 +0000 UTC m=+12.676840971"
Apr 16 20:11:52.170066 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:52.169994 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-7px7k" podStartSLOduration=3.6816409869999998 podStartE2EDuration="12.169966048s" podCreationTimestamp="2026-04-16 20:11:40 +0000 UTC" firstStartedPulling="2026-04-16 20:11:42.62114232 +0000 UTC m=+3.155498040" lastFinishedPulling="2026-04-16 20:11:51.109467385 +0000 UTC m=+11.643823101" observedRunningTime="2026-04-16 20:11:52.169803755 +0000 UTC m=+12.704159483" watchObservedRunningTime="2026-04-16 20:11:52.169966048 +0000 UTC m=+12.704321776"
Apr 16 20:11:52.235748 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:52.235699 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-pcr4b" podStartSLOduration=3.763730056 podStartE2EDuration="12.235685386s" podCreationTimestamp="2026-04-16 20:11:40 +0000 UTC" firstStartedPulling="2026-04-16 20:11:42.614048387 +0000 UTC m=+3.148404095" lastFinishedPulling="2026-04-16 20:11:51.086003702 +0000 UTC m=+11.620359425" observedRunningTime="2026-04-16 20:11:52.235663947 +0000 UTC m=+12.770019674" watchObservedRunningTime="2026-04-16 20:11:52.235685386 +0000 UTC m=+12.770041098"
Apr 16 20:11:52.236103 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:52.236075 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-j59fj" podStartSLOduration=3.774297483 podStartE2EDuration="12.236068842s" podCreationTimestamp="2026-04-16 20:11:40 +0000 UTC" firstStartedPulling="2026-04-16 20:11:42.618510505 +0000 UTC m=+3.152866211" lastFinishedPulling="2026-04-16 20:11:51.080281853 +0000 UTC m=+11.614637570" observedRunningTime="2026-04-16 20:11:52.192106432 +0000 UTC m=+12.726462157" watchObservedRunningTime="2026-04-16 20:11:52.236068842 +0000 UTC m=+12.770424567"
Apr 16 20:11:53.139176 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:53.139138 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-sdjtv" event={"ID":"74366810-9860-49b4-8e6b-049f12b47689","Type":"ContainerStarted","Data":"e985aef685aa621bc939de17b2a5e901149a7d6a647d111f8a8b6737ab43b0b9"}
Apr 16 20:11:53.152734 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:53.152691 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-sdjtv" podStartSLOduration=4.688621025 podStartE2EDuration="13.152677232s" podCreationTimestamp="2026-04-16 20:11:40 +0000 UTC" firstStartedPulling="2026-04-16 20:11:42.6211724 +0000 UTC m=+3.155528104" lastFinishedPulling="2026-04-16 20:11:51.085228598 +0000 UTC m=+11.619584311" observedRunningTime="2026-04-16 20:11:53.152653832 +0000 UTC m=+13.687009558" watchObservedRunningTime="2026-04-16 20:11:53.152677232 +0000 UTC m=+13.687032974"
Apr 16 20:11:54.055182 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:54.055138 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gj7ch"
Apr 16 20:11:54.055366 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:54.055138 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-54bk5"
Apr 16 20:11:54.055366 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:54.055278 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gj7ch" podUID="299edf22-1b19-43c6-aa15-8124389d617a"
Apr 16 20:11:54.055366 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:54.055147 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q46p8"
Apr 16 20:11:54.055366 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:54.055345 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-54bk5" podUID="958053a3-75a0-41f2-8f2b-9790c4c625d8"
Apr 16 20:11:54.055556 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:54.055451 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q46p8" podUID="d6bfc3ff-419c-47de-880a-e6eeafc7247a"
Apr 16 20:11:55.144165 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:55.144084 2576 generic.go:358] "Generic (PLEG): container finished" podID="74b468ec-61b6-4b33-96b9-598cc8545771" containerID="bdbedcaf2dd7c5e7c7464fc6811421c9168f11626f273fed41ce08b97c8e212c" exitCode=0
Apr 16 20:11:55.144165 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:55.144127 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xztj4" event={"ID":"74b468ec-61b6-4b33-96b9-598cc8545771","Type":"ContainerDied","Data":"bdbedcaf2dd7c5e7c7464fc6811421c9168f11626f273fed41ce08b97c8e212c"}
Apr 16 20:11:55.933948 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:55.933913 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-7px7k"
Apr 16 20:11:55.934530 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:55.934509 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-7px7k"
Apr 16 20:11:56.054924 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:56.054894 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q46p8"
Apr 16 20:11:56.055122 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:56.054894 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-54bk5"
Apr 16 20:11:56.055122 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:56.055036 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q46p8" podUID="d6bfc3ff-419c-47de-880a-e6eeafc7247a"
Apr 16 20:11:56.055122 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:56.054893 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gj7ch"
Apr 16 20:11:56.055122 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:56.055103 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-54bk5" podUID="958053a3-75a0-41f2-8f2b-9790c4c625d8"
Apr 16 20:11:56.055317 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:56.055155 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gj7ch" podUID="299edf22-1b19-43c6-aa15-8124389d617a"
Apr 16 20:11:57.662813 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:57.662783 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6bfc3ff-419c-47de-880a-e6eeafc7247a-metrics-certs\") pod \"network-metrics-daemon-q46p8\" (UID: \"d6bfc3ff-419c-47de-880a-e6eeafc7247a\") " pod="openshift-multus/network-metrics-daemon-q46p8"
Apr 16 20:11:57.663327 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:57.662945 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:11:57.663327 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:57.663022 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6bfc3ff-419c-47de-880a-e6eeafc7247a-metrics-certs podName:d6bfc3ff-419c-47de-880a-e6eeafc7247a nodeName:}" failed. No retries permitted until 2026-04-16 20:12:13.663003382 +0000 UTC m=+34.197359092 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d6bfc3ff-419c-47de-880a-e6eeafc7247a-metrics-certs") pod "network-metrics-daemon-q46p8" (UID: "d6bfc3ff-419c-47de-880a-e6eeafc7247a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:11:57.763813 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:57.763782 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctsdd\" (UniqueName: \"kubernetes.io/projected/958053a3-75a0-41f2-8f2b-9790c4c625d8-kube-api-access-ctsdd\") pod \"network-check-target-54bk5\" (UID: \"958053a3-75a0-41f2-8f2b-9790c4c625d8\") " pod="openshift-network-diagnostics/network-check-target-54bk5"
Apr 16 20:11:57.764004 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:57.763844 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/299edf22-1b19-43c6-aa15-8124389d617a-original-pull-secret\") pod \"global-pull-secret-syncer-gj7ch\" (UID: \"299edf22-1b19-43c6-aa15-8124389d617a\") " pod="kube-system/global-pull-secret-syncer-gj7ch"
Apr 16 20:11:57.764004 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:57.763956 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 20:11:57.764004 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:57.764003 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 20:11:57.764144 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:57.764013 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ctsdd for pod openshift-network-diagnostics/network-check-target-54bk5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:11:57.764144 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:57.764023 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 20:11:57.764144 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:57.764061 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/958053a3-75a0-41f2-8f2b-9790c4c625d8-kube-api-access-ctsdd podName:958053a3-75a0-41f2-8f2b-9790c4c625d8 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:13.764047013 +0000 UTC m=+34.298402718 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-ctsdd" (UniqueName: "kubernetes.io/projected/958053a3-75a0-41f2-8f2b-9790c4c625d8-kube-api-access-ctsdd") pod "network-check-target-54bk5" (UID: "958053a3-75a0-41f2-8f2b-9790c4c625d8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:11:57.764144 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:57.764078 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/299edf22-1b19-43c6-aa15-8124389d617a-original-pull-secret podName:299edf22-1b19-43c6-aa15-8124389d617a nodeName:}" failed. No retries permitted until 2026-04-16 20:12:13.7640696 +0000 UTC m=+34.298425305 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/299edf22-1b19-43c6-aa15-8124389d617a-original-pull-secret") pod "global-pull-secret-syncer-gj7ch" (UID: "299edf22-1b19-43c6-aa15-8124389d617a") : object "kube-system"/"original-pull-secret" not registered
Apr 16 20:11:58.054812 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:58.054702 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-54bk5"
Apr 16 20:11:58.054958 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:58.054711 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q46p8"
Apr 16 20:11:58.054958 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:58.054817 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-54bk5" podUID="958053a3-75a0-41f2-8f2b-9790c4c625d8"
Apr 16 20:11:58.054958 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:58.054852 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gj7ch"
Apr 16 20:11:58.055128 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:58.054963 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q46p8" podUID="d6bfc3ff-419c-47de-880a-e6eeafc7247a"
Apr 16 20:11:58.055128 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:11:58.055063 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gj7ch" podUID="299edf22-1b19-43c6-aa15-8124389d617a"
Apr 16 20:11:59.837348 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:59.837230 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 20:11:59.978013 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:59.977728 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T20:11:59.83734391Z","UUID":"1e6cef2c-d1eb-4b0d-b248-a27e192c0dbf","Handler":null,"Name":"","Endpoint":""}
Apr 16 20:11:59.981314 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:59.981292 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 20:11:59.981417 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:11:59.981323 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 20:12:00.055558 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:00.055539 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-54bk5"
Apr 16 20:12:00.055637 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:00.055622 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-54bk5" podUID="958053a3-75a0-41f2-8f2b-9790c4c625d8"
Apr 16 20:12:00.055674 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:00.055664 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q46p8"
Apr 16 20:12:00.055727 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:00.055712 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q46p8" podUID="d6bfc3ff-419c-47de-880a-e6eeafc7247a"
Apr 16 20:12:00.055762 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:00.055749 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gj7ch"
Apr 16 20:12:00.055814 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:00.055801 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gj7ch" podUID="299edf22-1b19-43c6-aa15-8124389d617a"
Apr 16 20:12:00.153857 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:00.153824 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cmxqh" event={"ID":"fd63426d-fc95-432e-ad91-1498d43b0e04","Type":"ContainerStarted","Data":"817d16e3eea4dbc301cbc670fd4691c4ee55594755edab4318bd004f3bfd2058"}
Apr 16 20:12:00.156485 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:00.156459 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt" event={"ID":"13dc8612-6ea4-453f-8d47-52023be79bf8","Type":"ContainerStarted","Data":"4033e802178b6ebc6b6a571db2c32fc90995a060ca794e6bbfa4a4c80eaf4639"}
Apr 16 20:12:00.156582 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:00.156494 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt" event={"ID":"13dc8612-6ea4-453f-8d47-52023be79bf8","Type":"ContainerStarted","Data":"aea8824888d23d098904ebe920377b0cc8fe9a119ac107e531f077d19b7202a8"}
Apr 16 20:12:00.156582 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:00.156507 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt" event={"ID":"13dc8612-6ea4-453f-8d47-52023be79bf8","Type":"ContainerStarted","Data":"d653e34c803522306331a98a276257481f32c7d179938d6cb340ca79d3b1a989"}
Apr 16 20:12:00.156582 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:00.156518 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt" event={"ID":"13dc8612-6ea4-453f-8d47-52023be79bf8","Type":"ContainerStarted","Data":"3f370a78c66f4711b2d1b58de557e2bdc73c2a3bc0cdc26517f48c67d19766b9"}
Apr 16 20:12:00.156582 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:00.156529 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt" event={"ID":"13dc8612-6ea4-453f-8d47-52023be79bf8","Type":"ContainerStarted","Data":"a3769d94d4fd8d25e30d3542bc115edcc29b26403642fdf94b44ae14332adc72"}
Apr 16 20:12:00.158371 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:00.158350 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgmkg" event={"ID":"1c8b43bd-c782-4cdf-bbc1-459c65b0298b","Type":"ContainerStarted","Data":"7d7e6ac3b182fd3e9b150fd8542515e4601c09c044eae9451b891986e35fa8e9"}
Apr 16 20:12:00.167447 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:00.167404 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-cmxqh" podStartSLOduration=3.208629288 podStartE2EDuration="20.167393256s" podCreationTimestamp="2026-04-16 20:11:40 +0000 UTC" firstStartedPulling="2026-04-16 20:11:42.615177064 +0000 UTC m=+3.149532768" lastFinishedPulling="2026-04-16 20:11:59.573941023 +0000 UTC m=+20.108296736" observedRunningTime="2026-04-16 20:12:00.166797347 +0000 UTC m=+20.701153072" watchObservedRunningTime="2026-04-16 20:12:00.167393256 +0000 UTC m=+20.701748982"
Apr 16 20:12:01.162760 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:01.162722 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt" event={"ID":"13dc8612-6ea4-453f-8d47-52023be79bf8","Type":"ContainerStarted","Data":"2c90da3655edc2b33aae2c9f738af968743e6aba4fafab8e49b918301d91e409"}
Apr 16 20:12:02.054941 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:02.054914 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-54bk5"
Apr 16 20:12:02.055129 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:02.054953 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q46p8"
Apr 16 20:12:02.055129 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:02.054914 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gj7ch"
Apr 16 20:12:02.055129 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:02.055039 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-54bk5" podUID="958053a3-75a0-41f2-8f2b-9790c4c625d8"
Apr 16 20:12:02.055282 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:02.055137 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gj7ch" podUID="299edf22-1b19-43c6-aa15-8124389d617a"
Apr 16 20:12:02.055282 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:02.055224 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q46p8" podUID="d6bfc3ff-419c-47de-880a-e6eeafc7247a"
Apr 16 20:12:02.165324 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:02.165299 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgmkg" event={"ID":"1c8b43bd-c782-4cdf-bbc1-459c65b0298b","Type":"ContainerStarted","Data":"53d3582c403f8680d855e6ec6cbb46d5fdad7b342587e54a9d7ae3a7571d55d9"}
Apr 16 20:12:02.204344 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:02.204298 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgmkg" podStartSLOduration=3.605484409 podStartE2EDuration="22.204287502s" podCreationTimestamp="2026-04-16 20:11:40 +0000 UTC" firstStartedPulling="2026-04-16 20:11:42.620748406 +0000 UTC m=+3.155104123" lastFinishedPulling="2026-04-16 20:12:01.219551511 +0000 UTC m=+21.753907216" observedRunningTime="2026-04-16 20:12:02.20415783 +0000 UTC m=+22.738513555" watchObservedRunningTime="2026-04-16 20:12:02.204287502 +0000 UTC m=+22.738643227"
Apr 16 20:12:03.169348 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:03.169317 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt" event={"ID":"13dc8612-6ea4-453f-8d47-52023be79bf8","Type":"ContainerStarted","Data":"934095e374a0b1f7f8db59d2ff8048e214771a3266ab34b5e1c5f766acdde239"}
Apr 16 20:12:04.055222 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:04.055192 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-54bk5"
Apr 16 20:12:04.055440 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:04.055193 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gj7ch"
Apr 16 20:12:04.055440 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:04.055299 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-54bk5" podUID="958053a3-75a0-41f2-8f2b-9790c4c625d8"
Apr 16 20:12:04.055440 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:04.055360 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gj7ch" podUID="299edf22-1b19-43c6-aa15-8124389d617a"
Apr 16 20:12:04.055440 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:04.055402 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q46p8"
Apr 16 20:12:04.055628 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:04.055460 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q46p8" podUID="d6bfc3ff-419c-47de-880a-e6eeafc7247a"
Apr 16 20:12:05.175407 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:05.175166 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt" event={"ID":"13dc8612-6ea4-453f-8d47-52023be79bf8","Type":"ContainerStarted","Data":"319e16fef4cc557f252373b390abbbc1a6f19cbf9af06d3bd1a3850bdb275c9a"}
Apr 16 20:12:05.175878 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:05.175431 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:12:05.176873 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:05.176850 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xztj4" event={"ID":"74b468ec-61b6-4b33-96b9-598cc8545771","Type":"ContainerStarted","Data":"d02b3ad8277a29c71243a6261675fc8e03bedb64d472dc2fab9b46ae06cf48d9"}
Apr 16 20:12:05.188746 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:05.188728 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:12:05.207256 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:05.207219 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt" podStartSLOduration=8.190038729 podStartE2EDuration="25.207208142s" podCreationTimestamp="2026-04-16 20:11:40 +0000 UTC" firstStartedPulling="2026-04-16 20:11:42.611720614 +0000 UTC m=+3.146076332" lastFinishedPulling="2026-04-16 20:11:59.628890022 +0000 UTC m=+20.163245745" observedRunningTime="2026-04-16 20:12:05.207126581 +0000 UTC m=+25.741482310" watchObservedRunningTime="2026-04-16 20:12:05.207208142 +0000 UTC m=+25.741563868"
Apr 16 20:12:06.054778 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:06.054745 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-54bk5"
Apr 16 20:12:06.054969 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:06.054745 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q46p8"
Apr 16 20:12:06.054969 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:06.054843 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-54bk5" podUID="958053a3-75a0-41f2-8f2b-9790c4c625d8"
Apr 16 20:12:06.054969 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:06.054940 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q46p8" podUID="d6bfc3ff-419c-47de-880a-e6eeafc7247a"
Apr 16 20:12:06.054969 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:06.054745 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gj7ch"
Apr 16 20:12:06.055119 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:06.055034 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gj7ch" podUID="299edf22-1b19-43c6-aa15-8124389d617a"
Apr 16 20:12:06.179726 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:06.179697 2576 generic.go:358] "Generic (PLEG): container finished" podID="74b468ec-61b6-4b33-96b9-598cc8545771" containerID="d02b3ad8277a29c71243a6261675fc8e03bedb64d472dc2fab9b46ae06cf48d9" exitCode=0
Apr 16 20:12:06.180120 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:06.179773 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xztj4" event={"ID":"74b468ec-61b6-4b33-96b9-598cc8545771","Type":"ContainerDied","Data":"d02b3ad8277a29c71243a6261675fc8e03bedb64d472dc2fab9b46ae06cf48d9"}
Apr 16 20:12:06.180238 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:06.180224 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:12:06.180275 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:06.180246 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:12:06.193669 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:06.193647 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt"
Apr 16 20:12:06.688627 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:06.688595 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-54bk5"]
Apr 16 20:12:06.688778 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:06.688731 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-54bk5" Apr 16 20:12:06.688878 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:06.688850 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-54bk5" podUID="958053a3-75a0-41f2-8f2b-9790c4c625d8" Apr 16 20:12:06.691771 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:06.691737 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-q46p8"] Apr 16 20:12:06.691873 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:06.691832 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q46p8" Apr 16 20:12:06.691937 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:06.691904 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q46p8" podUID="d6bfc3ff-419c-47de-880a-e6eeafc7247a" Apr 16 20:12:06.694722 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:06.694699 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gj7ch"] Apr 16 20:12:06.694823 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:06.694785 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-gj7ch" Apr 16 20:12:06.694866 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:06.694848 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gj7ch" podUID="299edf22-1b19-43c6-aa15-8124389d617a" Apr 16 20:12:08.054454 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:08.054429 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gj7ch" Apr 16 20:12:08.054454 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:08.054442 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-54bk5" Apr 16 20:12:08.054851 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:08.054453 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q46p8" Apr 16 20:12:08.054851 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:08.054525 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-gj7ch" podUID="299edf22-1b19-43c6-aa15-8124389d617a"
Apr 16 20:12:08.054851 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:08.054589 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-54bk5" podUID="958053a3-75a0-41f2-8f2b-9790c4c625d8"
Apr 16 20:12:08.054851 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:08.054642 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q46p8" podUID="d6bfc3ff-419c-47de-880a-e6eeafc7247a"
Apr 16 20:12:08.184607 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:08.184584 2576 generic.go:358] "Generic (PLEG): container finished" podID="74b468ec-61b6-4b33-96b9-598cc8545771" containerID="6b4c4f8cd652a2dda621abc59dd050db9f933b2ede5c0576f15a065184db521e" exitCode=0
Apr 16 20:12:08.184711 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:08.184653 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xztj4" event={"ID":"74b468ec-61b6-4b33-96b9-598cc8545771","Type":"ContainerDied","Data":"6b4c4f8cd652a2dda621abc59dd050db9f933b2ede5c0576f15a065184db521e"}
Apr 16 20:12:09.188093 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:09.187895 2576 generic.go:358] "Generic (PLEG): container finished" podID="74b468ec-61b6-4b33-96b9-598cc8545771" containerID="d0ef3af1447ec78590aeb206a29535cc3adc5f40ca910c790873d8dc8089424f" exitCode=0
Apr 16 20:12:09.188093 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:09.187972 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xztj4" event={"ID":"74b468ec-61b6-4b33-96b9-598cc8545771","Type":"ContainerDied","Data":"d0ef3af1447ec78590aeb206a29535cc3adc5f40ca910c790873d8dc8089424f"}
Apr 16 20:12:09.401560 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:09.401538 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-7px7k"
Apr 16 20:12:09.401682 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:09.401667 2576 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 20:12:09.402054 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:09.402035 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-7px7k"
Apr 16 20:12:10.056237 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:10.056211 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gj7ch"
Apr 16 20:12:10.057414 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:10.056616 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gj7ch" podUID="299edf22-1b19-43c6-aa15-8124389d617a"
Apr 16 20:12:10.057414 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:10.056685 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q46p8"
Apr 16 20:12:10.057414 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:10.056821 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q46p8" podUID="d6bfc3ff-419c-47de-880a-e6eeafc7247a"
Apr 16 20:12:10.057414 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:10.056829 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-54bk5"
Apr 16 20:12:10.057414 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:10.056929 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-54bk5" podUID="958053a3-75a0-41f2-8f2b-9790c4c625d8"
Apr 16 20:12:12.054994 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.054956 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-54bk5"
Apr 16 20:12:12.054994 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.054974 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q46p8"
Apr 16 20:12:12.055537 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.054953 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gj7ch"
Apr 16 20:12:12.055537 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:12.055070 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-54bk5" podUID="958053a3-75a0-41f2-8f2b-9790c4c625d8"
Apr 16 20:12:12.055537 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:12.055154 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gj7ch" podUID="299edf22-1b19-43c6-aa15-8124389d617a"
Apr 16 20:12:12.055537 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:12.055263 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q46p8" podUID="d6bfc3ff-419c-47de-880a-e6eeafc7247a"
Apr 16 20:12:12.343345 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.343279 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-145.ec2.internal" event="NodeReady"
Apr 16 20:12:12.343480 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.343402 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 20:12:12.380815 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.380789 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5c99487749-lnt44"]
Apr 16 20:12:12.400664 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.400261 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4f969457-xxvmf"]
Apr 16 20:12:12.423850 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.423821 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fb9b69659-24nkk"]
Apr 16 20:12:12.424036 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.423948 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5c99487749-lnt44"
Apr 16 20:12:12.424096 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.424061 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4f969457-xxvmf"
Apr 16 20:12:12.426934 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.426912 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 20:12:12.427081 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.426921 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 20:12:12.427731 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.427704 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 20:12:12.428293 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.428272 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 16 20:12:12.428394 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.428383 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 16 20:12:12.428676 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.428642 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-h42dm\""
Apr 16 20:12:12.428676 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.428650 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 16 20:12:12.428816 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.428679 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 16 20:12:12.439344 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.439324 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 20:12:12.448323 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.448304 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59965559f7-nqpnv"]
Apr 16 20:12:12.448427 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.448411 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fb9b69659-24nkk"
Apr 16 20:12:12.450731 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.450715 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-sqdwp\""
Apr 16 20:12:12.451202 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.451183 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 16 20:12:12.464304 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.464284 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-s265g"]
Apr 16 20:12:12.464426 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.464411 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59965559f7-nqpnv"
Apr 16 20:12:12.466894 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.466868 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 16 20:12:12.466894 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.466886 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 16 20:12:12.466894 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.466892 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 16 20:12:12.467100 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.466937 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 16 20:12:12.480010 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.479973 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fb9b69659-24nkk"]
Apr 16 20:12:12.480108 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.480016 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5c99487749-lnt44"]
Apr 16 20:12:12.480108 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.480031 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-t66hl"]
Apr 16 20:12:12.480108 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.480089 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-s265g"
Apr 16 20:12:12.482151 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.482132 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 20:12:12.482247 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.482236 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-gq66p\""
Apr 16 20:12:12.482334 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.482311 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 20:12:12.497137 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.497119 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59965559f7-nqpnv"]
Apr 16 20:12:12.497249 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.497144 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-s265g"]
Apr 16 20:12:12.497249 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.497157 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4f969457-xxvmf"]
Apr 16 20:12:12.497249 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.497166 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-t66hl"]
Apr 16 20:12:12.497400 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.497256 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-t66hl"
Apr 16 20:12:12.499999 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.499969 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 20:12:12.500153 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.500134 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 20:12:12.500239 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.500218 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 20:12:12.500389 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.500372 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-28d82\""
Apr 16 20:12:12.578310 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.578284 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-registry-tls\") pod \"image-registry-5c99487749-lnt44\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") " pod="openshift-image-registry/image-registry-5c99487749-lnt44"
Apr 16 20:12:12.578310 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.578315 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9k49\" (UniqueName: \"kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-kube-api-access-v9k49\") pod \"image-registry-5c99487749-lnt44\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") " pod="openshift-image-registry/image-registry-5c99487749-lnt44"
Apr 16 20:12:12.578502 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.578342 2576 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-installation-pull-secrets\") pod \"image-registry-5c99487749-lnt44\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") " pod="openshift-image-registry/image-registry-5c99487749-lnt44"
Apr 16 20:12:12.578502 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.578367 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/900b92c0-dfa0-40fc-8682-a12bd5f43f14-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-59965559f7-nqpnv\" (UID: \"900b92c0-dfa0-40fc-8682-a12bd5f43f14\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59965559f7-nqpnv"
Apr 16 20:12:12.578502 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.578393 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae1e5364-bc6b-4206-89ef-2aeef8ddd13c-cert\") pod \"ingress-canary-t66hl\" (UID: \"ae1e5364-bc6b-4206-89ef-2aeef8ddd13c\") " pod="openshift-ingress-canary/ingress-canary-t66hl"
Apr 16 20:12:12.578502 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.578439 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-ca-trust-extracted\") pod \"image-registry-5c99487749-lnt44\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") " pod="openshift-image-registry/image-registry-5c99487749-lnt44"
Apr 16 20:12:12.578502 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.578478 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-registry-certificates\") pod \"image-registry-5c99487749-lnt44\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") " pod="openshift-image-registry/image-registry-5c99487749-lnt44"
Apr 16 20:12:12.578502 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.578499 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-trusted-ca\") pod \"image-registry-5c99487749-lnt44\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") " pod="openshift-image-registry/image-registry-5c99487749-lnt44"
Apr 16 20:12:12.578761 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.578515 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-bound-sa-token\") pod \"image-registry-5c99487749-lnt44\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") " pod="openshift-image-registry/image-registry-5c99487749-lnt44"
Apr 16 20:12:12.578761 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.578530 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99f215f0-5679-47ef-9cb6-fee5d63d63ac-metrics-tls\") pod \"dns-default-s265g\" (UID: \"99f215f0-5679-47ef-9cb6-fee5d63d63ac\") " pod="openshift-dns/dns-default-s265g"
Apr 16 20:12:12.578761 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.578569 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e74e41ee-dedc-43b1-9d94-7a0127aa890c-tmp\") pod \"klusterlet-addon-workmgr-5b4f969457-xxvmf\" (UID: \"e74e41ee-dedc-43b1-9d94-7a0127aa890c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4f969457-xxvmf"
Apr 16 20:12:12.578761 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.578612 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v7s6\" (UniqueName: \"kubernetes.io/projected/99f215f0-5679-47ef-9cb6-fee5d63d63ac-kube-api-access-5v7s6\") pod \"dns-default-s265g\" (UID: \"99f215f0-5679-47ef-9cb6-fee5d63d63ac\") " pod="openshift-dns/dns-default-s265g"
Apr 16 20:12:12.578761 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.578643 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/52e34f51-e7b1-4bdc-a095-4ff3b00d61be-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-fb9b69659-24nkk\" (UID: \"52e34f51-e7b1-4bdc-a095-4ff3b00d61be\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fb9b69659-24nkk"
Apr 16 20:12:12.578761 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.578668 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7xqk\" (UniqueName: \"kubernetes.io/projected/52e34f51-e7b1-4bdc-a095-4ff3b00d61be-kube-api-access-g7xqk\") pod \"managed-serviceaccount-addon-agent-fb9b69659-24nkk\" (UID: \"52e34f51-e7b1-4bdc-a095-4ff3b00d61be\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fb9b69659-24nkk"
Apr 16 20:12:12.578761 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.578697 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/900b92c0-dfa0-40fc-8682-a12bd5f43f14-ca\") pod \"cluster-proxy-proxy-agent-59965559f7-nqpnv\" (UID: \"900b92c0-dfa0-40fc-8682-a12bd5f43f14\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59965559f7-nqpnv"
Apr 16 20:12:12.578761 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.578721 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/900b92c0-dfa0-40fc-8682-a12bd5f43f14-hub\") pod \"cluster-proxy-proxy-agent-59965559f7-nqpnv\" (UID: \"900b92c0-dfa0-40fc-8682-a12bd5f43f14\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59965559f7-nqpnv"
Apr 16 20:12:12.578761 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.578744 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/e74e41ee-dedc-43b1-9d94-7a0127aa890c-klusterlet-config\") pod \"klusterlet-addon-workmgr-5b4f969457-xxvmf\" (UID: \"e74e41ee-dedc-43b1-9d94-7a0127aa890c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4f969457-xxvmf"
Apr 16 20:12:12.579174 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.578812 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/900b92c0-dfa0-40fc-8682-a12bd5f43f14-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-59965559f7-nqpnv\" (UID: \"900b92c0-dfa0-40fc-8682-a12bd5f43f14\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59965559f7-nqpnv"
Apr 16 20:12:12.579174 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.578839 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/99f215f0-5679-47ef-9cb6-fee5d63d63ac-config-volume\") pod \"dns-default-s265g\" (UID: \"99f215f0-5679-47ef-9cb6-fee5d63d63ac\") " pod="openshift-dns/dns-default-s265g"
Apr 16 20:12:12.579174 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.578864 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jppmt\" (UniqueName: \"kubernetes.io/projected/e74e41ee-dedc-43b1-9d94-7a0127aa890c-kube-api-access-jppmt\") pod \"klusterlet-addon-workmgr-5b4f969457-xxvmf\" (UID: \"e74e41ee-dedc-43b1-9d94-7a0127aa890c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4f969457-xxvmf"
Apr 16 20:12:12.579174 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.578901 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-image-registry-private-configuration\") pod \"image-registry-5c99487749-lnt44\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") " pod="openshift-image-registry/image-registry-5c99487749-lnt44"
Apr 16 20:12:12.579174 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.578929 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/900b92c0-dfa0-40fc-8682-a12bd5f43f14-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-59965559f7-nqpnv\" (UID: \"900b92c0-dfa0-40fc-8682-a12bd5f43f14\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59965559f7-nqpnv"
Apr 16 20:12:12.579174 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.579005 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jphfx\" (UniqueName: \"kubernetes.io/projected/ae1e5364-bc6b-4206-89ef-2aeef8ddd13c-kube-api-access-jphfx\") pod \"ingress-canary-t66hl\" (UID: \"ae1e5364-bc6b-4206-89ef-2aeef8ddd13c\") " pod="openshift-ingress-canary/ingress-canary-t66hl"
Apr 16 20:12:12.579174 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.579026 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbpfr\" (UniqueName: \"kubernetes.io/projected/900b92c0-dfa0-40fc-8682-a12bd5f43f14-kube-api-access-gbpfr\") pod \"cluster-proxy-proxy-agent-59965559f7-nqpnv\" (UID: \"900b92c0-dfa0-40fc-8682-a12bd5f43f14\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59965559f7-nqpnv"
Apr 16 20:12:12.579174 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.579052 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/99f215f0-5679-47ef-9cb6-fee5d63d63ac-tmp-dir\") pod \"dns-default-s265g\" (UID: \"99f215f0-5679-47ef-9cb6-fee5d63d63ac\") " pod="openshift-dns/dns-default-s265g"
Apr 16 20:12:12.679621 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.679557 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g7xqk\" (UniqueName: \"kubernetes.io/projected/52e34f51-e7b1-4bdc-a095-4ff3b00d61be-kube-api-access-g7xqk\") pod \"managed-serviceaccount-addon-agent-fb9b69659-24nkk\" (UID: \"52e34f51-e7b1-4bdc-a095-4ff3b00d61be\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fb9b69659-24nkk"
Apr 16 20:12:12.679621 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.679599 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/900b92c0-dfa0-40fc-8682-a12bd5f43f14-ca\") pod \"cluster-proxy-proxy-agent-59965559f7-nqpnv\" (UID: \"900b92c0-dfa0-40fc-8682-a12bd5f43f14\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59965559f7-nqpnv"
Apr 16 20:12:12.679789 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.679622 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/900b92c0-dfa0-40fc-8682-a12bd5f43f14-hub\") pod \"cluster-proxy-proxy-agent-59965559f7-nqpnv\" (UID: \"900b92c0-dfa0-40fc-8682-a12bd5f43f14\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59965559f7-nqpnv"
Apr 16 20:12:12.679789 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.679647 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for
volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/e74e41ee-dedc-43b1-9d94-7a0127aa890c-klusterlet-config\") pod \"klusterlet-addon-workmgr-5b4f969457-xxvmf\" (UID: \"e74e41ee-dedc-43b1-9d94-7a0127aa890c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4f969457-xxvmf" Apr 16 20:12:12.679789 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.679718 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/900b92c0-dfa0-40fc-8682-a12bd5f43f14-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-59965559f7-nqpnv\" (UID: \"900b92c0-dfa0-40fc-8682-a12bd5f43f14\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59965559f7-nqpnv" Apr 16 20:12:12.679789 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.679759 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/99f215f0-5679-47ef-9cb6-fee5d63d63ac-config-volume\") pod \"dns-default-s265g\" (UID: \"99f215f0-5679-47ef-9cb6-fee5d63d63ac\") " pod="openshift-dns/dns-default-s265g" Apr 16 20:12:12.679927 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.679785 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jppmt\" (UniqueName: \"kubernetes.io/projected/e74e41ee-dedc-43b1-9d94-7a0127aa890c-kube-api-access-jppmt\") pod \"klusterlet-addon-workmgr-5b4f969457-xxvmf\" (UID: \"e74e41ee-dedc-43b1-9d94-7a0127aa890c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4f969457-xxvmf" Apr 16 20:12:12.679927 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.679834 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-image-registry-private-configuration\") pod \"image-registry-5c99487749-lnt44\" (UID: 
\"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") " pod="openshift-image-registry/image-registry-5c99487749-lnt44" Apr 16 20:12:12.679927 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.679859 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/900b92c0-dfa0-40fc-8682-a12bd5f43f14-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-59965559f7-nqpnv\" (UID: \"900b92c0-dfa0-40fc-8682-a12bd5f43f14\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59965559f7-nqpnv" Apr 16 20:12:12.679927 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.679895 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jphfx\" (UniqueName: \"kubernetes.io/projected/ae1e5364-bc6b-4206-89ef-2aeef8ddd13c-kube-api-access-jphfx\") pod \"ingress-canary-t66hl\" (UID: \"ae1e5364-bc6b-4206-89ef-2aeef8ddd13c\") " pod="openshift-ingress-canary/ingress-canary-t66hl" Apr 16 20:12:12.679927 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.679910 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gbpfr\" (UniqueName: \"kubernetes.io/projected/900b92c0-dfa0-40fc-8682-a12bd5f43f14-kube-api-access-gbpfr\") pod \"cluster-proxy-proxy-agent-59965559f7-nqpnv\" (UID: \"900b92c0-dfa0-40fc-8682-a12bd5f43f14\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59965559f7-nqpnv" Apr 16 20:12:12.680087 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.679930 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/99f215f0-5679-47ef-9cb6-fee5d63d63ac-tmp-dir\") pod \"dns-default-s265g\" (UID: \"99f215f0-5679-47ef-9cb6-fee5d63d63ac\") " pod="openshift-dns/dns-default-s265g" Apr 16 20:12:12.680087 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.679953 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-registry-tls\") pod \"image-registry-5c99487749-lnt44\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") " pod="openshift-image-registry/image-registry-5c99487749-lnt44" Apr 16 20:12:12.680087 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.679988 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v9k49\" (UniqueName: \"kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-kube-api-access-v9k49\") pod \"image-registry-5c99487749-lnt44\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") " pod="openshift-image-registry/image-registry-5c99487749-lnt44" Apr 16 20:12:12.680087 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.680014 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-installation-pull-secrets\") pod \"image-registry-5c99487749-lnt44\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") " pod="openshift-image-registry/image-registry-5c99487749-lnt44" Apr 16 20:12:12.680087 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.680028 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/900b92c0-dfa0-40fc-8682-a12bd5f43f14-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-59965559f7-nqpnv\" (UID: \"900b92c0-dfa0-40fc-8682-a12bd5f43f14\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59965559f7-nqpnv" Apr 16 20:12:12.680087 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.680046 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae1e5364-bc6b-4206-89ef-2aeef8ddd13c-cert\") pod \"ingress-canary-t66hl\" (UID: \"ae1e5364-bc6b-4206-89ef-2aeef8ddd13c\") " 
pod="openshift-ingress-canary/ingress-canary-t66hl" Apr 16 20:12:12.680087 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.680059 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-ca-trust-extracted\") pod \"image-registry-5c99487749-lnt44\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") " pod="openshift-image-registry/image-registry-5c99487749-lnt44" Apr 16 20:12:12.680087 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.680074 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-registry-certificates\") pod \"image-registry-5c99487749-lnt44\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") " pod="openshift-image-registry/image-registry-5c99487749-lnt44" Apr 16 20:12:12.680087 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.680088 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-trusted-ca\") pod \"image-registry-5c99487749-lnt44\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") " pod="openshift-image-registry/image-registry-5c99487749-lnt44" Apr 16 20:12:12.680361 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.680102 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-bound-sa-token\") pod \"image-registry-5c99487749-lnt44\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") " pod="openshift-image-registry/image-registry-5c99487749-lnt44" Apr 16 20:12:12.680361 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.680117 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/99f215f0-5679-47ef-9cb6-fee5d63d63ac-metrics-tls\") pod \"dns-default-s265g\" (UID: \"99f215f0-5679-47ef-9cb6-fee5d63d63ac\") " pod="openshift-dns/dns-default-s265g" Apr 16 20:12:12.680361 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.680133 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e74e41ee-dedc-43b1-9d94-7a0127aa890c-tmp\") pod \"klusterlet-addon-workmgr-5b4f969457-xxvmf\" (UID: \"e74e41ee-dedc-43b1-9d94-7a0127aa890c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4f969457-xxvmf" Apr 16 20:12:12.680361 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.680152 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5v7s6\" (UniqueName: \"kubernetes.io/projected/99f215f0-5679-47ef-9cb6-fee5d63d63ac-kube-api-access-5v7s6\") pod \"dns-default-s265g\" (UID: \"99f215f0-5679-47ef-9cb6-fee5d63d63ac\") " pod="openshift-dns/dns-default-s265g" Apr 16 20:12:12.680361 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.680171 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/52e34f51-e7b1-4bdc-a095-4ff3b00d61be-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-fb9b69659-24nkk\" (UID: \"52e34f51-e7b1-4bdc-a095-4ff3b00d61be\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fb9b69659-24nkk" Apr 16 20:12:12.680847 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.680666 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/900b92c0-dfa0-40fc-8682-a12bd5f43f14-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-59965559f7-nqpnv\" (UID: \"900b92c0-dfa0-40fc-8682-a12bd5f43f14\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59965559f7-nqpnv" Apr 16 20:12:12.680847 ip-10-0-141-145 
kubenswrapper[2576]: E0416 20:12:12.680777 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:12:12.680847 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:12.680793 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c99487749-lnt44: secret "image-registry-tls" not found Apr 16 20:12:12.681021 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:12.680856 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-registry-tls podName:c0fd6853-ffb5-4eab-9ba0-39451aeb636a nodeName:}" failed. No retries permitted until 2026-04-16 20:12:13.180838605 +0000 UTC m=+33.715194330 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-registry-tls") pod "image-registry-5c99487749-lnt44" (UID: "c0fd6853-ffb5-4eab-9ba0-39451aeb636a") : secret "image-registry-tls" not found Apr 16 20:12:12.681630 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.681605 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/99f215f0-5679-47ef-9cb6-fee5d63d63ac-tmp-dir\") pod \"dns-default-s265g\" (UID: \"99f215f0-5679-47ef-9cb6-fee5d63d63ac\") " pod="openshift-dns/dns-default-s265g" Apr 16 20:12:12.681729 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.680787 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/99f215f0-5679-47ef-9cb6-fee5d63d63ac-config-volume\") pod \"dns-default-s265g\" (UID: \"99f215f0-5679-47ef-9cb6-fee5d63d63ac\") " pod="openshift-dns/dns-default-s265g" Apr 16 20:12:12.682554 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.682528 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-trusted-ca\") pod \"image-registry-5c99487749-lnt44\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") " pod="openshift-image-registry/image-registry-5c99487749-lnt44" Apr 16 20:12:12.682644 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.682622 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-ca-trust-extracted\") pod \"image-registry-5c99487749-lnt44\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") " pod="openshift-image-registry/image-registry-5c99487749-lnt44" Apr 16 20:12:12.682703 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:12.682627 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:12:12.682751 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:12.682709 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99f215f0-5679-47ef-9cb6-fee5d63d63ac-metrics-tls podName:99f215f0-5679-47ef-9cb6-fee5d63d63ac nodeName:}" failed. No retries permitted until 2026-04-16 20:12:13.182691564 +0000 UTC m=+33.717047275 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/99f215f0-5679-47ef-9cb6-fee5d63d63ac-metrics-tls") pod "dns-default-s265g" (UID: "99f215f0-5679-47ef-9cb6-fee5d63d63ac") : secret "dns-default-metrics-tls" not found Apr 16 20:12:12.682812 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:12.682780 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:12:12.682859 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:12.682814 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae1e5364-bc6b-4206-89ef-2aeef8ddd13c-cert podName:ae1e5364-bc6b-4206-89ef-2aeef8ddd13c nodeName:}" failed. No retries permitted until 2026-04-16 20:12:13.182801693 +0000 UTC m=+33.717157411 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae1e5364-bc6b-4206-89ef-2aeef8ddd13c-cert") pod "ingress-canary-t66hl" (UID: "ae1e5364-bc6b-4206-89ef-2aeef8ddd13c") : secret "canary-serving-cert" not found Apr 16 20:12:12.685379 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.685305 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-image-registry-private-configuration\") pod \"image-registry-5c99487749-lnt44\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") " pod="openshift-image-registry/image-registry-5c99487749-lnt44" Apr 16 20:12:12.685481 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.685429 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/900b92c0-dfa0-40fc-8682-a12bd5f43f14-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-59965559f7-nqpnv\" (UID: \"900b92c0-dfa0-40fc-8682-a12bd5f43f14\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59965559f7-nqpnv" Apr 16 20:12:12.685573 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.685552 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/900b92c0-dfa0-40fc-8682-a12bd5f43f14-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-59965559f7-nqpnv\" (UID: \"900b92c0-dfa0-40fc-8682-a12bd5f43f14\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59965559f7-nqpnv" Apr 16 20:12:12.685963 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.685924 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/e74e41ee-dedc-43b1-9d94-7a0127aa890c-klusterlet-config\") pod \"klusterlet-addon-workmgr-5b4f969457-xxvmf\" (UID: \"e74e41ee-dedc-43b1-9d94-7a0127aa890c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4f969457-xxvmf" Apr 16 20:12:12.686317 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.686296 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-installation-pull-secrets\") pod \"image-registry-5c99487749-lnt44\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") " pod="openshift-image-registry/image-registry-5c99487749-lnt44" Apr 16 20:12:12.689699 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.689606 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e74e41ee-dedc-43b1-9d94-7a0127aa890c-tmp\") pod \"klusterlet-addon-workmgr-5b4f969457-xxvmf\" (UID: \"e74e41ee-dedc-43b1-9d94-7a0127aa890c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4f969457-xxvmf" Apr 16 20:12:12.690216 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.690177 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-v9k49\" (UniqueName: \"kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-kube-api-access-v9k49\") pod \"image-registry-5c99487749-lnt44\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") " pod="openshift-image-registry/image-registry-5c99487749-lnt44" Apr 16 20:12:12.690615 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.690557 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-registry-certificates\") pod \"image-registry-5c99487749-lnt44\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") " pod="openshift-image-registry/image-registry-5c99487749-lnt44" Apr 16 20:12:12.690697 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.690657 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v7s6\" (UniqueName: \"kubernetes.io/projected/99f215f0-5679-47ef-9cb6-fee5d63d63ac-kube-api-access-5v7s6\") pod \"dns-default-s265g\" (UID: \"99f215f0-5679-47ef-9cb6-fee5d63d63ac\") " pod="openshift-dns/dns-default-s265g" Apr 16 20:12:12.691134 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.691082 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jppmt\" (UniqueName: \"kubernetes.io/projected/e74e41ee-dedc-43b1-9d94-7a0127aa890c-kube-api-access-jppmt\") pod \"klusterlet-addon-workmgr-5b4f969457-xxvmf\" (UID: \"e74e41ee-dedc-43b1-9d94-7a0127aa890c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4f969457-xxvmf" Apr 16 20:12:12.691134 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.691122 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jphfx\" (UniqueName: \"kubernetes.io/projected/ae1e5364-bc6b-4206-89ef-2aeef8ddd13c-kube-api-access-jphfx\") pod \"ingress-canary-t66hl\" (UID: \"ae1e5364-bc6b-4206-89ef-2aeef8ddd13c\") " 
pod="openshift-ingress-canary/ingress-canary-t66hl" Apr 16 20:12:12.692088 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.692066 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-bound-sa-token\") pod \"image-registry-5c99487749-lnt44\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") " pod="openshift-image-registry/image-registry-5c99487749-lnt44" Apr 16 20:12:12.692842 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.692646 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/900b92c0-dfa0-40fc-8682-a12bd5f43f14-ca\") pod \"cluster-proxy-proxy-agent-59965559f7-nqpnv\" (UID: \"900b92c0-dfa0-40fc-8682-a12bd5f43f14\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59965559f7-nqpnv" Apr 16 20:12:12.693103 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.693063 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/52e34f51-e7b1-4bdc-a095-4ff3b00d61be-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-fb9b69659-24nkk\" (UID: \"52e34f51-e7b1-4bdc-a095-4ff3b00d61be\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fb9b69659-24nkk" Apr 16 20:12:12.693343 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.693269 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/900b92c0-dfa0-40fc-8682-a12bd5f43f14-hub\") pod \"cluster-proxy-proxy-agent-59965559f7-nqpnv\" (UID: \"900b92c0-dfa0-40fc-8682-a12bd5f43f14\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59965559f7-nqpnv" Apr 16 20:12:12.694874 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.694850 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7xqk\" (UniqueName: 
\"kubernetes.io/projected/52e34f51-e7b1-4bdc-a095-4ff3b00d61be-kube-api-access-g7xqk\") pod \"managed-serviceaccount-addon-agent-fb9b69659-24nkk\" (UID: \"52e34f51-e7b1-4bdc-a095-4ff3b00d61be\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fb9b69659-24nkk" Apr 16 20:12:12.695359 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.695340 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbpfr\" (UniqueName: \"kubernetes.io/projected/900b92c0-dfa0-40fc-8682-a12bd5f43f14-kube-api-access-gbpfr\") pod \"cluster-proxy-proxy-agent-59965559f7-nqpnv\" (UID: \"900b92c0-dfa0-40fc-8682-a12bd5f43f14\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59965559f7-nqpnv" Apr 16 20:12:12.737162 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.737142 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4f969457-xxvmf" Apr 16 20:12:12.767874 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.767853 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fb9b69659-24nkk" Apr 16 20:12:12.775899 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.775879 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59965559f7-nqpnv" Apr 16 20:12:12.879402 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:12.879194 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4f969457-xxvmf"] Apr 16 20:12:13.183335 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:13.183307 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-registry-tls\") pod \"image-registry-5c99487749-lnt44\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") " pod="openshift-image-registry/image-registry-5c99487749-lnt44" Apr 16 20:12:13.183916 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:13.183349 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae1e5364-bc6b-4206-89ef-2aeef8ddd13c-cert\") pod \"ingress-canary-t66hl\" (UID: \"ae1e5364-bc6b-4206-89ef-2aeef8ddd13c\") " pod="openshift-ingress-canary/ingress-canary-t66hl" Apr 16 20:12:13.183916 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:13.183370 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99f215f0-5679-47ef-9cb6-fee5d63d63ac-metrics-tls\") pod \"dns-default-s265g\" (UID: \"99f215f0-5679-47ef-9cb6-fee5d63d63ac\") " pod="openshift-dns/dns-default-s265g" Apr 16 20:12:13.183916 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:13.183469 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:12:13.183916 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:13.183496 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c99487749-lnt44: secret "image-registry-tls" not found Apr 
16 20:12:13.183916 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:13.183468 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:12:13.183916 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:13.183575 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-registry-tls podName:c0fd6853-ffb5-4eab-9ba0-39451aeb636a nodeName:}" failed. No retries permitted until 2026-04-16 20:12:14.183536164 +0000 UTC m=+34.717891872 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-registry-tls") pod "image-registry-5c99487749-lnt44" (UID: "c0fd6853-ffb5-4eab-9ba0-39451aeb636a") : secret "image-registry-tls" not found Apr 16 20:12:13.183916 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:13.183623 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae1e5364-bc6b-4206-89ef-2aeef8ddd13c-cert podName:ae1e5364-bc6b-4206-89ef-2aeef8ddd13c nodeName:}" failed. No retries permitted until 2026-04-16 20:12:14.183610736 +0000 UTC m=+34.717966445 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae1e5364-bc6b-4206-89ef-2aeef8ddd13c-cert") pod "ingress-canary-t66hl" (UID: "ae1e5364-bc6b-4206-89ef-2aeef8ddd13c") : secret "canary-serving-cert" not found Apr 16 20:12:13.183916 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:13.183473 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:12:13.183916 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:13.183651 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99f215f0-5679-47ef-9cb6-fee5d63d63ac-metrics-tls podName:99f215f0-5679-47ef-9cb6-fee5d63d63ac nodeName:}" failed. 
No retries permitted until 2026-04-16 20:12:14.183644869 +0000 UTC m=+34.718000573 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/99f215f0-5679-47ef-9cb6-fee5d63d63ac-metrics-tls") pod "dns-default-s265g" (UID: "99f215f0-5679-47ef-9cb6-fee5d63d63ac") : secret "dns-default-metrics-tls" not found Apr 16 20:12:13.688025 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:13.687967 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6bfc3ff-419c-47de-880a-e6eeafc7247a-metrics-certs\") pod \"network-metrics-daemon-q46p8\" (UID: \"d6bfc3ff-419c-47de-880a-e6eeafc7247a\") " pod="openshift-multus/network-metrics-daemon-q46p8" Apr 16 20:12:13.688193 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:13.688107 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:12:13.688193 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:13.688175 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6bfc3ff-419c-47de-880a-e6eeafc7247a-metrics-certs podName:d6bfc3ff-419c-47de-880a-e6eeafc7247a nodeName:}" failed. No retries permitted until 2026-04-16 20:12:45.688160266 +0000 UTC m=+66.222515972 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d6bfc3ff-419c-47de-880a-e6eeafc7247a-metrics-certs") pod "network-metrics-daemon-q46p8" (UID: "d6bfc3ff-419c-47de-880a-e6eeafc7247a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:12:13.788620 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:13.788586 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/299edf22-1b19-43c6-aa15-8124389d617a-original-pull-secret\") pod \"global-pull-secret-syncer-gj7ch\" (UID: \"299edf22-1b19-43c6-aa15-8124389d617a\") " pod="kube-system/global-pull-secret-syncer-gj7ch" Apr 16 20:12:13.788791 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:13.788697 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctsdd\" (UniqueName: \"kubernetes.io/projected/958053a3-75a0-41f2-8f2b-9790c4c625d8-kube-api-access-ctsdd\") pod \"network-check-target-54bk5\" (UID: \"958053a3-75a0-41f2-8f2b-9790c4c625d8\") " pod="openshift-network-diagnostics/network-check-target-54bk5" Apr 16 20:12:13.788791 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:13.788736 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:12:13.788897 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:13.788802 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/299edf22-1b19-43c6-aa15-8124389d617a-original-pull-secret podName:299edf22-1b19-43c6-aa15-8124389d617a nodeName:}" failed. No retries permitted until 2026-04-16 20:12:45.788782967 +0000 UTC m=+66.323138683 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/299edf22-1b19-43c6-aa15-8124389d617a-original-pull-secret") pod "global-pull-secret-syncer-gj7ch" (UID: "299edf22-1b19-43c6-aa15-8124389d617a") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:12:13.788897 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:13.788847 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:12:13.788897 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:13.788873 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:12:13.788897 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:13.788888 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ctsdd for pod openshift-network-diagnostics/network-check-target-54bk5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:12:13.789118 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:13.788937 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/958053a3-75a0-41f2-8f2b-9790c4c625d8-kube-api-access-ctsdd podName:958053a3-75a0-41f2-8f2b-9790c4c625d8 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:45.78892527 +0000 UTC m=+66.323280990 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ctsdd" (UniqueName: "kubernetes.io/projected/958053a3-75a0-41f2-8f2b-9790c4c625d8-kube-api-access-ctsdd") pod "network-check-target-54bk5" (UID: "958053a3-75a0-41f2-8f2b-9790c4c625d8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:12:14.055262 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:14.055228 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gj7ch" Apr 16 20:12:14.055438 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:14.055228 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q46p8" Apr 16 20:12:14.055791 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:14.055228 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-54bk5" Apr 16 20:12:14.057928 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:14.057904 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 20:12:14.058093 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:14.058024 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 20:12:14.058093 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:14.058049 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 20:12:14.058207 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:14.058050 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 20:12:14.059053 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:14.059036 2576 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-fgc2s\"" Apr 16 20:12:14.059160 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:14.059061 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-8662n\"" Apr 16 20:12:14.192962 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:14.192929 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-registry-tls\") pod \"image-registry-5c99487749-lnt44\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") " pod="openshift-image-registry/image-registry-5c99487749-lnt44" Apr 16 20:12:14.193396 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:14.192972 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae1e5364-bc6b-4206-89ef-2aeef8ddd13c-cert\") pod \"ingress-canary-t66hl\" (UID: \"ae1e5364-bc6b-4206-89ef-2aeef8ddd13c\") " pod="openshift-ingress-canary/ingress-canary-t66hl" Apr 16 20:12:14.193396 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:14.193020 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99f215f0-5679-47ef-9cb6-fee5d63d63ac-metrics-tls\") pod \"dns-default-s265g\" (UID: \"99f215f0-5679-47ef-9cb6-fee5d63d63ac\") " pod="openshift-dns/dns-default-s265g" Apr 16 20:12:14.193396 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:14.193128 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:12:14.193396 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:14.193146 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:12:14.193396 ip-10-0-141-145 
kubenswrapper[2576]: E0416 20:12:14.193160 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c99487749-lnt44: secret "image-registry-tls" not found Apr 16 20:12:14.193396 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:14.193130 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:12:14.193396 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:14.193202 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae1e5364-bc6b-4206-89ef-2aeef8ddd13c-cert podName:ae1e5364-bc6b-4206-89ef-2aeef8ddd13c nodeName:}" failed. No retries permitted until 2026-04-16 20:12:16.193181356 +0000 UTC m=+36.727537065 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae1e5364-bc6b-4206-89ef-2aeef8ddd13c-cert") pod "ingress-canary-t66hl" (UID: "ae1e5364-bc6b-4206-89ef-2aeef8ddd13c") : secret "canary-serving-cert" not found Apr 16 20:12:14.193396 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:14.193223 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-registry-tls podName:c0fd6853-ffb5-4eab-9ba0-39451aeb636a nodeName:}" failed. No retries permitted until 2026-04-16 20:12:16.19321166 +0000 UTC m=+36.727567365 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-registry-tls") pod "image-registry-5c99487749-lnt44" (UID: "c0fd6853-ffb5-4eab-9ba0-39451aeb636a") : secret "image-registry-tls" not found Apr 16 20:12:14.193396 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:14.193239 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99f215f0-5679-47ef-9cb6-fee5d63d63ac-metrics-tls podName:99f215f0-5679-47ef-9cb6-fee5d63d63ac nodeName:}" failed. No retries permitted until 2026-04-16 20:12:16.193230732 +0000 UTC m=+36.727586438 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/99f215f0-5679-47ef-9cb6-fee5d63d63ac-metrics-tls") pod "dns-default-s265g" (UID: "99f215f0-5679-47ef-9cb6-fee5d63d63ac") : secret "dns-default-metrics-tls" not found Apr 16 20:12:14.880108 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:12:14.880073 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode74e41ee_dedc_43b1_9d94_7a0127aa890c.slice/crio-bca9b3cf1fdd3b569b0073f00607f41c251b1f2874273c847d6b104f53a80764 WatchSource:0}: Error finding container bca9b3cf1fdd3b569b0073f00607f41c251b1f2874273c847d6b104f53a80764: Status 404 returned error can't find the container with id bca9b3cf1fdd3b569b0073f00607f41c251b1f2874273c847d6b104f53a80764 Apr 16 20:12:15.024074 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:15.024048 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fb9b69659-24nkk"] Apr 16 20:12:15.027744 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:12:15.027720 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52e34f51_e7b1_4bdc_a095_4ff3b00d61be.slice/crio-1fcd673454081bd622f6d7674b626bd4fed35ae40a4ee37dad10670ac07a0db1 WatchSource:0}: Error finding container 1fcd673454081bd622f6d7674b626bd4fed35ae40a4ee37dad10670ac07a0db1: Status 404 returned error can't find the container with id 1fcd673454081bd622f6d7674b626bd4fed35ae40a4ee37dad10670ac07a0db1 Apr 16 20:12:15.030659 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:15.030633 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59965559f7-nqpnv"] Apr 16 20:12:15.033350 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:12:15.033328 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod900b92c0_dfa0_40fc_8682_a12bd5f43f14.slice/crio-29c68b9c5d894791d297efd00418e4451805a26d23a29eb8a8c5ac8eb9b41401 WatchSource:0}: Error finding container 29c68b9c5d894791d297efd00418e4451805a26d23a29eb8a8c5ac8eb9b41401: Status 404 returned error can't find the container with id 29c68b9c5d894791d297efd00418e4451805a26d23a29eb8a8c5ac8eb9b41401 Apr 16 20:12:15.202957 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:15.200170 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fb9b69659-24nkk" event={"ID":"52e34f51-e7b1-4bdc-a095-4ff3b00d61be","Type":"ContainerStarted","Data":"1fcd673454081bd622f6d7674b626bd4fed35ae40a4ee37dad10670ac07a0db1"} Apr 16 20:12:15.204315 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:15.204285 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4f969457-xxvmf" event={"ID":"e74e41ee-dedc-43b1-9d94-7a0127aa890c","Type":"ContainerStarted","Data":"bca9b3cf1fdd3b569b0073f00607f41c251b1f2874273c847d6b104f53a80764"} Apr 16 20:12:15.205180 ip-10-0-141-145 
kubenswrapper[2576]: I0416 20:12:15.205158 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59965559f7-nqpnv" event={"ID":"900b92c0-dfa0-40fc-8682-a12bd5f43f14","Type":"ContainerStarted","Data":"29c68b9c5d894791d297efd00418e4451805a26d23a29eb8a8c5ac8eb9b41401"} Apr 16 20:12:16.214087 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:16.214047 2576 generic.go:358] "Generic (PLEG): container finished" podID="74b468ec-61b6-4b33-96b9-598cc8545771" containerID="b39ad469fda60a505b3888ef19039df202a6d380d113ff3caf854543cd3fc382" exitCode=0 Apr 16 20:12:16.214524 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:16.214127 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xztj4" event={"ID":"74b468ec-61b6-4b33-96b9-598cc8545771","Type":"ContainerDied","Data":"b39ad469fda60a505b3888ef19039df202a6d380d113ff3caf854543cd3fc382"} Apr 16 20:12:16.215653 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:16.215293 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-registry-tls\") pod \"image-registry-5c99487749-lnt44\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") " pod="openshift-image-registry/image-registry-5c99487749-lnt44" Apr 16 20:12:16.215653 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:16.215335 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae1e5364-bc6b-4206-89ef-2aeef8ddd13c-cert\") pod \"ingress-canary-t66hl\" (UID: \"ae1e5364-bc6b-4206-89ef-2aeef8ddd13c\") " pod="openshift-ingress-canary/ingress-canary-t66hl" Apr 16 20:12:16.215653 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:16.215365 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/99f215f0-5679-47ef-9cb6-fee5d63d63ac-metrics-tls\") pod \"dns-default-s265g\" (UID: \"99f215f0-5679-47ef-9cb6-fee5d63d63ac\") " pod="openshift-dns/dns-default-s265g" Apr 16 20:12:16.215653 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:16.215441 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:12:16.215653 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:16.215460 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c99487749-lnt44: secret "image-registry-tls" not found Apr 16 20:12:16.215653 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:16.215499 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:12:16.215653 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:16.215512 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-registry-tls podName:c0fd6853-ffb5-4eab-9ba0-39451aeb636a nodeName:}" failed. No retries permitted until 2026-04-16 20:12:20.215493368 +0000 UTC m=+40.749849077 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-registry-tls") pod "image-registry-5c99487749-lnt44" (UID: "c0fd6853-ffb5-4eab-9ba0-39451aeb636a") : secret "image-registry-tls" not found Apr 16 20:12:16.215653 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:16.215544 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99f215f0-5679-47ef-9cb6-fee5d63d63ac-metrics-tls podName:99f215f0-5679-47ef-9cb6-fee5d63d63ac nodeName:}" failed. No retries permitted until 2026-04-16 20:12:20.215529237 +0000 UTC m=+40.749884942 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/99f215f0-5679-47ef-9cb6-fee5d63d63ac-metrics-tls") pod "dns-default-s265g" (UID: "99f215f0-5679-47ef-9cb6-fee5d63d63ac") : secret "dns-default-metrics-tls" not found Apr 16 20:12:16.215653 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:16.215603 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:12:16.215653 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:16.215633 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae1e5364-bc6b-4206-89ef-2aeef8ddd13c-cert podName:ae1e5364-bc6b-4206-89ef-2aeef8ddd13c nodeName:}" failed. No retries permitted until 2026-04-16 20:12:20.215621779 +0000 UTC m=+40.749977484 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae1e5364-bc6b-4206-89ef-2aeef8ddd13c-cert") pod "ingress-canary-t66hl" (UID: "ae1e5364-bc6b-4206-89ef-2aeef8ddd13c") : secret "canary-serving-cert" not found Apr 16 20:12:17.221662 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:17.221247 2576 generic.go:358] "Generic (PLEG): container finished" podID="74b468ec-61b6-4b33-96b9-598cc8545771" containerID="36b57d92195238e168174ac57b06b1f51272965676f18156f6dddc309a356d53" exitCode=0 Apr 16 20:12:17.221662 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:17.221387 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xztj4" event={"ID":"74b468ec-61b6-4b33-96b9-598cc8545771","Type":"ContainerDied","Data":"36b57d92195238e168174ac57b06b1f51272965676f18156f6dddc309a356d53"} Apr 16 20:12:20.228870 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:20.228674 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4f969457-xxvmf" 
event={"ID":"e74e41ee-dedc-43b1-9d94-7a0127aa890c","Type":"ContainerStarted","Data":"4a66d28646d965755d0ac249a86ea69a5e4b5aded442afcaa557d98722de496c"} Apr 16 20:12:20.229367 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:20.228897 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4f969457-xxvmf" Apr 16 20:12:20.230085 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:20.230061 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59965559f7-nqpnv" event={"ID":"900b92c0-dfa0-40fc-8682-a12bd5f43f14","Type":"ContainerStarted","Data":"87dab7d2e339a566b428e721c555568d4b06772460fd1ba9856d310b9f60c64e"} Apr 16 20:12:20.230320 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:20.230287 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4f969457-xxvmf" Apr 16 20:12:20.231325 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:20.231301 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fb9b69659-24nkk" event={"ID":"52e34f51-e7b1-4bdc-a095-4ff3b00d61be","Type":"ContainerStarted","Data":"3bc9cad7badb88a9f6e60d1328fe9d658da4116fddc97c201021f836113b4f31"} Apr 16 20:12:20.233953 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:20.233936 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xztj4" event={"ID":"74b468ec-61b6-4b33-96b9-598cc8545771","Type":"ContainerStarted","Data":"35bbefe4aab156b67b2620a2aabb8a21a71ce36e698577b5a9a46d47c887af3f"} Apr 16 20:12:20.246961 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:20.246926 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4f969457-xxvmf" 
podStartSLOduration=8.187143272 podStartE2EDuration="13.246916015s" podCreationTimestamp="2026-04-16 20:12:07 +0000 UTC" firstStartedPulling="2026-04-16 20:12:14.883051352 +0000 UTC m=+35.417407071" lastFinishedPulling="2026-04-16 20:12:19.942824104 +0000 UTC m=+40.477179814" observedRunningTime="2026-04-16 20:12:20.246098198 +0000 UTC m=+40.780453923" watchObservedRunningTime="2026-04-16 20:12:20.246916015 +0000 UTC m=+40.781271740" Apr 16 20:12:20.249274 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:20.249254 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-registry-tls\") pod \"image-registry-5c99487749-lnt44\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") " pod="openshift-image-registry/image-registry-5c99487749-lnt44" Apr 16 20:12:20.249321 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:20.249288 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae1e5364-bc6b-4206-89ef-2aeef8ddd13c-cert\") pod \"ingress-canary-t66hl\" (UID: \"ae1e5364-bc6b-4206-89ef-2aeef8ddd13c\") " pod="openshift-ingress-canary/ingress-canary-t66hl" Apr 16 20:12:20.249321 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:20.249308 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99f215f0-5679-47ef-9cb6-fee5d63d63ac-metrics-tls\") pod \"dns-default-s265g\" (UID: \"99f215f0-5679-47ef-9cb6-fee5d63d63ac\") " pod="openshift-dns/dns-default-s265g" Apr 16 20:12:20.249401 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:20.249389 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:12:20.249437 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:20.249405 2576 projected.go:194] Error preparing data for projected volume registry-tls for 
pod openshift-image-registry/image-registry-5c99487749-lnt44: secret "image-registry-tls" not found Apr 16 20:12:20.249473 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:20.249446 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-registry-tls podName:c0fd6853-ffb5-4eab-9ba0-39451aeb636a nodeName:}" failed. No retries permitted until 2026-04-16 20:12:28.249433949 +0000 UTC m=+48.783789656 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-registry-tls") pod "image-registry-5c99487749-lnt44" (UID: "c0fd6853-ffb5-4eab-9ba0-39451aeb636a") : secret "image-registry-tls" not found Apr 16 20:12:20.249473 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:20.249450 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:12:20.249563 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:20.249490 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae1e5364-bc6b-4206-89ef-2aeef8ddd13c-cert podName:ae1e5364-bc6b-4206-89ef-2aeef8ddd13c nodeName:}" failed. No retries permitted until 2026-04-16 20:12:28.249478051 +0000 UTC m=+48.783833759 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae1e5364-bc6b-4206-89ef-2aeef8ddd13c-cert") pod "ingress-canary-t66hl" (UID: "ae1e5364-bc6b-4206-89ef-2aeef8ddd13c") : secret "canary-serving-cert" not found Apr 16 20:12:20.252300 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:20.249895 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:12:20.252300 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:20.249962 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99f215f0-5679-47ef-9cb6-fee5d63d63ac-metrics-tls podName:99f215f0-5679-47ef-9cb6-fee5d63d63ac nodeName:}" failed. No retries permitted until 2026-04-16 20:12:28.249937877 +0000 UTC m=+48.784293584 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/99f215f0-5679-47ef-9cb6-fee5d63d63ac-metrics-tls") pod "dns-default-s265g" (UID: "99f215f0-5679-47ef-9cb6-fee5d63d63ac") : secret "dns-default-metrics-tls" not found Apr 16 20:12:20.273754 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:20.273714 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-xztj4" podStartSLOduration=7.667365328 podStartE2EDuration="40.273699904s" podCreationTimestamp="2026-04-16 20:11:40 +0000 UTC" firstStartedPulling="2026-04-16 20:11:42.618008285 +0000 UTC m=+3.152364002" lastFinishedPulling="2026-04-16 20:12:15.22434287 +0000 UTC m=+35.758698578" observedRunningTime="2026-04-16 20:12:20.272183089 +0000 UTC m=+40.806538826" watchObservedRunningTime="2026-04-16 20:12:20.273699904 +0000 UTC m=+40.808055633" Apr 16 20:12:20.293788 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:20.293751 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fb9b69659-24nkk" 
podStartSLOduration=8.396156669 podStartE2EDuration="13.293740691s" podCreationTimestamp="2026-04-16 20:12:07 +0000 UTC" firstStartedPulling="2026-04-16 20:12:15.02979946 +0000 UTC m=+35.564155166" lastFinishedPulling="2026-04-16 20:12:19.927383484 +0000 UTC m=+40.461739188" observedRunningTime="2026-04-16 20:12:20.292740597 +0000 UTC m=+40.827096315" watchObservedRunningTime="2026-04-16 20:12:20.293740691 +0000 UTC m=+40.828096408" Apr 16 20:12:22.239518 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:22.239490 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59965559f7-nqpnv" event={"ID":"900b92c0-dfa0-40fc-8682-a12bd5f43f14","Type":"ContainerStarted","Data":"180b905494b86e2dd29ae19c328c2fddf6bd7c5f008fb6b6f1d1e128cabe8214"} Apr 16 20:12:22.240030 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:22.239525 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59965559f7-nqpnv" event={"ID":"900b92c0-dfa0-40fc-8682-a12bd5f43f14","Type":"ContainerStarted","Data":"526f10285481e35d4d8d97b55533b2ca210de4c3df344280db2db0db2aaf0dc9"} Apr 16 20:12:22.258540 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:22.258501 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59965559f7-nqpnv" podStartSLOduration=8.363126871 podStartE2EDuration="15.258489454s" podCreationTimestamp="2026-04-16 20:12:07 +0000 UTC" firstStartedPulling="2026-04-16 20:12:15.034884426 +0000 UTC m=+35.569240130" lastFinishedPulling="2026-04-16 20:12:21.930247005 +0000 UTC m=+42.464602713" observedRunningTime="2026-04-16 20:12:22.256923566 +0000 UTC m=+42.791279314" watchObservedRunningTime="2026-04-16 20:12:22.258489454 +0000 UTC m=+42.792845178" Apr 16 20:12:28.305109 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:28.305071 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-registry-tls\") pod \"image-registry-5c99487749-lnt44\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") " pod="openshift-image-registry/image-registry-5c99487749-lnt44" Apr 16 20:12:28.305606 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:28.305114 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae1e5364-bc6b-4206-89ef-2aeef8ddd13c-cert\") pod \"ingress-canary-t66hl\" (UID: \"ae1e5364-bc6b-4206-89ef-2aeef8ddd13c\") " pod="openshift-ingress-canary/ingress-canary-t66hl" Apr 16 20:12:28.305606 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:28.305200 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:12:28.305606 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:28.305207 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:12:28.305606 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:28.305216 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c99487749-lnt44: secret "image-registry-tls" not found Apr 16 20:12:28.305606 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:28.305223 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99f215f0-5679-47ef-9cb6-fee5d63d63ac-metrics-tls\") pod \"dns-default-s265g\" (UID: \"99f215f0-5679-47ef-9cb6-fee5d63d63ac\") " pod="openshift-dns/dns-default-s265g" Apr 16 20:12:28.305606 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:28.305261 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-registry-tls 
podName:c0fd6853-ffb5-4eab-9ba0-39451aeb636a nodeName:}" failed. No retries permitted until 2026-04-16 20:12:44.305246786 +0000 UTC m=+64.839602493 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-registry-tls") pod "image-registry-5c99487749-lnt44" (UID: "c0fd6853-ffb5-4eab-9ba0-39451aeb636a") : secret "image-registry-tls" not found Apr 16 20:12:28.305606 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:28.305295 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:12:28.305606 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:28.305309 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae1e5364-bc6b-4206-89ef-2aeef8ddd13c-cert podName:ae1e5364-bc6b-4206-89ef-2aeef8ddd13c nodeName:}" failed. No retries permitted until 2026-04-16 20:12:44.305281184 +0000 UTC m=+64.839636891 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae1e5364-bc6b-4206-89ef-2aeef8ddd13c-cert") pod "ingress-canary-t66hl" (UID: "ae1e5364-bc6b-4206-89ef-2aeef8ddd13c") : secret "canary-serving-cert" not found Apr 16 20:12:28.305606 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:28.305335 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99f215f0-5679-47ef-9cb6-fee5d63d63ac-metrics-tls podName:99f215f0-5679-47ef-9cb6-fee5d63d63ac nodeName:}" failed. No retries permitted until 2026-04-16 20:12:44.305327832 +0000 UTC m=+64.839683537 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/99f215f0-5679-47ef-9cb6-fee5d63d63ac-metrics-tls") pod "dns-default-s265g" (UID: "99f215f0-5679-47ef-9cb6-fee5d63d63ac") : secret "dns-default-metrics-tls" not found Apr 16 20:12:38.202426 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:38.202400 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4t2rt" Apr 16 20:12:44.321692 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:44.321659 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae1e5364-bc6b-4206-89ef-2aeef8ddd13c-cert\") pod \"ingress-canary-t66hl\" (UID: \"ae1e5364-bc6b-4206-89ef-2aeef8ddd13c\") " pod="openshift-ingress-canary/ingress-canary-t66hl" Apr 16 20:12:44.321692 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:44.321694 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99f215f0-5679-47ef-9cb6-fee5d63d63ac-metrics-tls\") pod \"dns-default-s265g\" (UID: \"99f215f0-5679-47ef-9cb6-fee5d63d63ac\") " pod="openshift-dns/dns-default-s265g" Apr 16 20:12:44.322172 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:44.321783 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-registry-tls\") pod \"image-registry-5c99487749-lnt44\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") " pod="openshift-image-registry/image-registry-5c99487749-lnt44" Apr 16 20:12:44.322172 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:44.321812 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:12:44.322172 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:44.321859 2576 projected.go:264] Couldn't get secret 
openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:12:44.322172 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:44.321869 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae1e5364-bc6b-4206-89ef-2aeef8ddd13c-cert podName:ae1e5364-bc6b-4206-89ef-2aeef8ddd13c nodeName:}" failed. No retries permitted until 2026-04-16 20:13:16.32185013 +0000 UTC m=+96.856205843 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae1e5364-bc6b-4206-89ef-2aeef8ddd13c-cert") pod "ingress-canary-t66hl" (UID: "ae1e5364-bc6b-4206-89ef-2aeef8ddd13c") : secret "canary-serving-cert" not found Apr 16 20:12:44.322172 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:44.321870 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c99487749-lnt44: secret "image-registry-tls" not found Apr 16 20:12:44.322172 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:44.321872 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:12:44.322172 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:44.321902 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-registry-tls podName:c0fd6853-ffb5-4eab-9ba0-39451aeb636a nodeName:}" failed. No retries permitted until 2026-04-16 20:13:16.321893452 +0000 UTC m=+96.856249160 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-registry-tls") pod "image-registry-5c99487749-lnt44" (UID: "c0fd6853-ffb5-4eab-9ba0-39451aeb636a") : secret "image-registry-tls" not found Apr 16 20:12:44.322172 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:44.321920 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99f215f0-5679-47ef-9cb6-fee5d63d63ac-metrics-tls podName:99f215f0-5679-47ef-9cb6-fee5d63d63ac nodeName:}" failed. No retries permitted until 2026-04-16 20:13:16.321907647 +0000 UTC m=+96.856263356 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/99f215f0-5679-47ef-9cb6-fee5d63d63ac-metrics-tls") pod "dns-default-s265g" (UID: "99f215f0-5679-47ef-9cb6-fee5d63d63ac") : secret "dns-default-metrics-tls" not found Apr 16 20:12:45.730955 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:45.730916 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6bfc3ff-419c-47de-880a-e6eeafc7247a-metrics-certs\") pod \"network-metrics-daemon-q46p8\" (UID: \"d6bfc3ff-419c-47de-880a-e6eeafc7247a\") " pod="openshift-multus/network-metrics-daemon-q46p8" Apr 16 20:12:45.733297 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:45.733282 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 20:12:45.741123 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:45.741098 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 20:12:45.741224 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:12:45.741152 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6bfc3ff-419c-47de-880a-e6eeafc7247a-metrics-certs 
podName:d6bfc3ff-419c-47de-880a-e6eeafc7247a nodeName:}" failed. No retries permitted until 2026-04-16 20:13:49.741137539 +0000 UTC m=+130.275493242 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d6bfc3ff-419c-47de-880a-e6eeafc7247a-metrics-certs") pod "network-metrics-daemon-q46p8" (UID: "d6bfc3ff-419c-47de-880a-e6eeafc7247a") : secret "metrics-daemon-secret" not found Apr 16 20:12:45.831570 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:45.831549 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/299edf22-1b19-43c6-aa15-8124389d617a-original-pull-secret\") pod \"global-pull-secret-syncer-gj7ch\" (UID: \"299edf22-1b19-43c6-aa15-8124389d617a\") " pod="kube-system/global-pull-secret-syncer-gj7ch" Apr 16 20:12:45.831638 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:45.831615 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctsdd\" (UniqueName: \"kubernetes.io/projected/958053a3-75a0-41f2-8f2b-9790c4c625d8-kube-api-access-ctsdd\") pod \"network-check-target-54bk5\" (UID: \"958053a3-75a0-41f2-8f2b-9790c4c625d8\") " pod="openshift-network-diagnostics/network-check-target-54bk5" Apr 16 20:12:45.833826 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:45.833806 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 20:12:45.833952 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:45.833938 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 20:12:45.844088 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:45.844069 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 20:12:45.845575 ip-10-0-141-145 
kubenswrapper[2576]: I0416 20:12:45.845555 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/299edf22-1b19-43c6-aa15-8124389d617a-original-pull-secret\") pod \"global-pull-secret-syncer-gj7ch\" (UID: \"299edf22-1b19-43c6-aa15-8124389d617a\") " pod="kube-system/global-pull-secret-syncer-gj7ch" Apr 16 20:12:45.855067 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:45.855041 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctsdd\" (UniqueName: \"kubernetes.io/projected/958053a3-75a0-41f2-8f2b-9790c4c625d8-kube-api-access-ctsdd\") pod \"network-check-target-54bk5\" (UID: \"958053a3-75a0-41f2-8f2b-9790c4c625d8\") " pod="openshift-network-diagnostics/network-check-target-54bk5" Apr 16 20:12:45.867813 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:45.867793 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gj7ch" Apr 16 20:12:45.881834 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:45.881814 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-fgc2s\"" Apr 16 20:12:45.889782 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:45.889760 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-54bk5" Apr 16 20:12:45.985225 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:45.985169 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gj7ch"] Apr 16 20:12:45.988446 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:12:45.988418 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod299edf22_1b19_43c6_aa15_8124389d617a.slice/crio-9d5091275130d4768fc32c72653eab30aa9cb41291a946c6d705fe8a759f534c WatchSource:0}: Error finding container 9d5091275130d4768fc32c72653eab30aa9cb41291a946c6d705fe8a759f534c: Status 404 returned error can't find the container with id 9d5091275130d4768fc32c72653eab30aa9cb41291a946c6d705fe8a759f534c Apr 16 20:12:46.007042 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:46.007001 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-54bk5"] Apr 16 20:12:46.009105 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:12:46.009083 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod958053a3_75a0_41f2_8f2b_9790c4c625d8.slice/crio-f8c2e3870f18106d5944cdcf266668edc05e8bfb34423b538f3f819372894065 WatchSource:0}: Error finding container f8c2e3870f18106d5944cdcf266668edc05e8bfb34423b538f3f819372894065: Status 404 returned error can't find the container with id f8c2e3870f18106d5944cdcf266668edc05e8bfb34423b538f3f819372894065 Apr 16 20:12:46.282166 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:46.282103 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gj7ch" event={"ID":"299edf22-1b19-43c6-aa15-8124389d617a","Type":"ContainerStarted","Data":"9d5091275130d4768fc32c72653eab30aa9cb41291a946c6d705fe8a759f534c"} Apr 16 20:12:46.283017 ip-10-0-141-145 kubenswrapper[2576]: I0416 
20:12:46.282975 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-54bk5" event={"ID":"958053a3-75a0-41f2-8f2b-9790c4c625d8","Type":"ContainerStarted","Data":"f8c2e3870f18106d5944cdcf266668edc05e8bfb34423b538f3f819372894065"} Apr 16 20:12:49.291504 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:49.291468 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-54bk5" event={"ID":"958053a3-75a0-41f2-8f2b-9790c4c625d8","Type":"ContainerStarted","Data":"5b5b3349bd0614a5096dfbee627b760515626ac3dd466e56781c78f06d567484"} Apr 16 20:12:49.291922 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:49.291599 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-54bk5" Apr 16 20:12:49.307503 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:49.307462 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-54bk5" podStartSLOduration=66.382793132 podStartE2EDuration="1m9.307449751s" podCreationTimestamp="2026-04-16 20:11:40 +0000 UTC" firstStartedPulling="2026-04-16 20:12:46.010705548 +0000 UTC m=+66.545061252" lastFinishedPulling="2026-04-16 20:12:48.935362152 +0000 UTC m=+69.469717871" observedRunningTime="2026-04-16 20:12:49.306222462 +0000 UTC m=+69.840578202" watchObservedRunningTime="2026-04-16 20:12:49.307449751 +0000 UTC m=+69.841805477" Apr 16 20:12:51.296699 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:51.296581 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gj7ch" event={"ID":"299edf22-1b19-43c6-aa15-8124389d617a","Type":"ContainerStarted","Data":"b914e63351706303871f76ff7d7ad1dcdb354235003d5dc70859d749b2712fe7"} Apr 16 20:12:51.311871 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:12:51.311831 2576 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="kube-system/global-pull-secret-syncer-gj7ch" podStartSLOduration=65.98401819 podStartE2EDuration="1m10.311819636s" podCreationTimestamp="2026-04-16 20:11:41 +0000 UTC" firstStartedPulling="2026-04-16 20:12:45.989908793 +0000 UTC m=+66.524264498" lastFinishedPulling="2026-04-16 20:12:50.317710239 +0000 UTC m=+70.852065944" observedRunningTime="2026-04-16 20:12:51.311664085 +0000 UTC m=+71.846019810" watchObservedRunningTime="2026-04-16 20:12:51.311819636 +0000 UTC m=+71.846175362" Apr 16 20:13:16.340092 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:13:16.340067 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-registry-tls\") pod \"image-registry-5c99487749-lnt44\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") " pod="openshift-image-registry/image-registry-5c99487749-lnt44" Apr 16 20:13:16.340390 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:13:16.340108 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae1e5364-bc6b-4206-89ef-2aeef8ddd13c-cert\") pod \"ingress-canary-t66hl\" (UID: \"ae1e5364-bc6b-4206-89ef-2aeef8ddd13c\") " pod="openshift-ingress-canary/ingress-canary-t66hl" Apr 16 20:13:16.340390 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:13:16.340126 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99f215f0-5679-47ef-9cb6-fee5d63d63ac-metrics-tls\") pod \"dns-default-s265g\" (UID: \"99f215f0-5679-47ef-9cb6-fee5d63d63ac\") " pod="openshift-dns/dns-default-s265g" Apr 16 20:13:16.340390 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:13:16.340210 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:13:16.340390 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:13:16.340215 2576 
secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:13:16.340390 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:13:16.340265 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99f215f0-5679-47ef-9cb6-fee5d63d63ac-metrics-tls podName:99f215f0-5679-47ef-9cb6-fee5d63d63ac nodeName:}" failed. No retries permitted until 2026-04-16 20:14:20.340246204 +0000 UTC m=+160.874601911 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/99f215f0-5679-47ef-9cb6-fee5d63d63ac-metrics-tls") pod "dns-default-s265g" (UID: "99f215f0-5679-47ef-9cb6-fee5d63d63ac") : secret "dns-default-metrics-tls" not found Apr 16 20:13:16.340390 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:13:16.340217 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:13:16.340390 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:13:16.340278 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae1e5364-bc6b-4206-89ef-2aeef8ddd13c-cert podName:ae1e5364-bc6b-4206-89ef-2aeef8ddd13c nodeName:}" failed. No retries permitted until 2026-04-16 20:14:20.340272069 +0000 UTC m=+160.874627774 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae1e5364-bc6b-4206-89ef-2aeef8ddd13c-cert") pod "ingress-canary-t66hl" (UID: "ae1e5364-bc6b-4206-89ef-2aeef8ddd13c") : secret "canary-serving-cert" not found Apr 16 20:13:16.340390 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:13:16.340278 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c99487749-lnt44: secret "image-registry-tls" not found Apr 16 20:13:16.340390 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:13:16.340311 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-registry-tls podName:c0fd6853-ffb5-4eab-9ba0-39451aeb636a nodeName:}" failed. No retries permitted until 2026-04-16 20:14:20.34029545 +0000 UTC m=+160.874651177 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-registry-tls") pod "image-registry-5c99487749-lnt44" (UID: "c0fd6853-ffb5-4eab-9ba0-39451aeb636a") : secret "image-registry-tls" not found Apr 16 20:13:20.296830 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:13:20.296801 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-54bk5" Apr 16 20:13:49.774641 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:13:49.774602 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6bfc3ff-419c-47de-880a-e6eeafc7247a-metrics-certs\") pod \"network-metrics-daemon-q46p8\" (UID: \"d6bfc3ff-419c-47de-880a-e6eeafc7247a\") " pod="openshift-multus/network-metrics-daemon-q46p8" Apr 16 20:13:49.775101 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:13:49.774709 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" 
not found Apr 16 20:13:49.775101 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:13:49.774767 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6bfc3ff-419c-47de-880a-e6eeafc7247a-metrics-certs podName:d6bfc3ff-419c-47de-880a-e6eeafc7247a nodeName:}" failed. No retries permitted until 2026-04-16 20:15:51.774753263 +0000 UTC m=+252.309108967 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d6bfc3ff-419c-47de-880a-e6eeafc7247a-metrics-certs") pod "network-metrics-daemon-q46p8" (UID: "d6bfc3ff-419c-47de-880a-e6eeafc7247a") : secret "metrics-daemon-secret" not found Apr 16 20:14:06.979472 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:06.979430 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-pcr4b_cf44a204-89e9-429e-a562-8e270453e2d3/dns-node-resolver/0.log" Apr 16 20:14:07.982587 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:07.982558 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-j59fj_2e2f3589-a853-4295-aedb-5f577abc797b/node-ca/0.log" Apr 16 20:14:15.442548 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:14:15.442512 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-5c99487749-lnt44" podUID="c0fd6853-ffb5-4eab-9ba0-39451aeb636a" Apr 16 20:14:15.480923 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:15.480894 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5c99487749-lnt44" Apr 16 20:14:15.499371 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:14:15.499344 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-s265g" podUID="99f215f0-5679-47ef-9cb6-fee5d63d63ac" Apr 16 20:14:15.517513 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:14:15.517485 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-t66hl" podUID="ae1e5364-bc6b-4206-89ef-2aeef8ddd13c" Apr 16 20:14:16.483609 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:16.483583 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-s265g" Apr 16 20:14:17.074572 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:14:17.074543 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-q46p8" podUID="d6bfc3ff-419c-47de-880a-e6eeafc7247a" Apr 16 20:14:20.229753 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:20.229705 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4f969457-xxvmf" podUID="e74e41ee-dedc-43b1-9d94-7a0127aa890c" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.10:8000/readyz\": dial tcp 10.132.0.10:8000: connect: connection refused" Apr 16 20:14:20.395847 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:20.395796 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-registry-tls\") pod \"image-registry-5c99487749-lnt44\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") " pod="openshift-image-registry/image-registry-5c99487749-lnt44" Apr 16 20:14:20.395847 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:20.395830 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae1e5364-bc6b-4206-89ef-2aeef8ddd13c-cert\") pod \"ingress-canary-t66hl\" (UID: \"ae1e5364-bc6b-4206-89ef-2aeef8ddd13c\") " pod="openshift-ingress-canary/ingress-canary-t66hl" Apr 16 20:14:20.395995 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:20.395850 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99f215f0-5679-47ef-9cb6-fee5d63d63ac-metrics-tls\") pod \"dns-default-s265g\" (UID: \"99f215f0-5679-47ef-9cb6-fee5d63d63ac\") " pod="openshift-dns/dns-default-s265g" Apr 16 20:14:20.397891 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:20.397866 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99f215f0-5679-47ef-9cb6-fee5d63d63ac-metrics-tls\") pod \"dns-default-s265g\" (UID: \"99f215f0-5679-47ef-9cb6-fee5d63d63ac\") " pod="openshift-dns/dns-default-s265g" Apr 16 20:14:20.397968 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:20.397956 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-registry-tls\") pod \"image-registry-5c99487749-lnt44\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") " pod="openshift-image-registry/image-registry-5c99487749-lnt44" Apr 16 20:14:20.398032 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:20.398004 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/ae1e5364-bc6b-4206-89ef-2aeef8ddd13c-cert\") pod \"ingress-canary-t66hl\" (UID: \"ae1e5364-bc6b-4206-89ef-2aeef8ddd13c\") " pod="openshift-ingress-canary/ingress-canary-t66hl" Apr 16 20:14:20.493581 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:20.493554 2576 generic.go:358] "Generic (PLEG): container finished" podID="e74e41ee-dedc-43b1-9d94-7a0127aa890c" containerID="4a66d28646d965755d0ac249a86ea69a5e4b5aded442afcaa557d98722de496c" exitCode=1 Apr 16 20:14:20.493673 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:20.493626 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4f969457-xxvmf" event={"ID":"e74e41ee-dedc-43b1-9d94-7a0127aa890c","Type":"ContainerDied","Data":"4a66d28646d965755d0ac249a86ea69a5e4b5aded442afcaa557d98722de496c"} Apr 16 20:14:20.493907 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:20.493892 2576 scope.go:117] "RemoveContainer" containerID="4a66d28646d965755d0ac249a86ea69a5e4b5aded442afcaa557d98722de496c" Apr 16 20:14:20.496606 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:20.496571 2576 generic.go:358] "Generic (PLEG): container finished" podID="52e34f51-e7b1-4bdc-a095-4ff3b00d61be" containerID="3bc9cad7badb88a9f6e60d1328fe9d658da4116fddc97c201021f836113b4f31" exitCode=255 Apr 16 20:14:20.496671 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:20.496606 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fb9b69659-24nkk" event={"ID":"52e34f51-e7b1-4bdc-a095-4ff3b00d61be","Type":"ContainerDied","Data":"3bc9cad7badb88a9f6e60d1328fe9d658da4116fddc97c201021f836113b4f31"} Apr 16 20:14:20.496877 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:20.496858 2576 scope.go:117] "RemoveContainer" containerID="3bc9cad7badb88a9f6e60d1328fe9d658da4116fddc97c201021f836113b4f31" Apr 16 20:14:20.583819 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:20.583794 2576 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-h42dm\"" Apr 16 20:14:20.591785 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:20.591758 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5c99487749-lnt44" Apr 16 20:14:20.686762 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:20.686739 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-gq66p\"" Apr 16 20:14:20.694889 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:20.694864 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-s265g" Apr 16 20:14:20.708566 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:20.708521 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5c99487749-lnt44"] Apr 16 20:14:20.712264 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:14:20.712238 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0fd6853_ffb5_4eab_9ba0_39451aeb636a.slice/crio-af2643ead5fe8d31e4b0919ef471333a98f55cb0a6b88318a339d96509ccd2bd WatchSource:0}: Error finding container af2643ead5fe8d31e4b0919ef471333a98f55cb0a6b88318a339d96509ccd2bd: Status 404 returned error can't find the container with id af2643ead5fe8d31e4b0919ef471333a98f55cb0a6b88318a339d96509ccd2bd Apr 16 20:14:20.807195 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:20.807166 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-s265g"] Apr 16 20:14:20.810240 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:14:20.810212 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99f215f0_5679_47ef_9cb6_fee5d63d63ac.slice/crio-8da17c6e1826bd0a104a612c3cc34a4158a84ee46835fdd3c7c564d20b94e734 WatchSource:0}: Error finding container 8da17c6e1826bd0a104a612c3cc34a4158a84ee46835fdd3c7c564d20b94e734: Status 404 returned error can't find the container with id 8da17c6e1826bd0a104a612c3cc34a4158a84ee46835fdd3c7c564d20b94e734 Apr 16 20:14:21.501203 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:21.500881 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4f969457-xxvmf" event={"ID":"e74e41ee-dedc-43b1-9d94-7a0127aa890c","Type":"ContainerStarted","Data":"ca97d0da567a165947eef10b935b0c80cc5f4a7beb2fa1417c5b38b04cc2a95e"} Apr 16 20:14:21.501599 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:21.501207 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4f969457-xxvmf" Apr 16 20:14:21.501875 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:21.501855 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4f969457-xxvmf" Apr 16 20:14:21.502082 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:21.502062 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-s265g" event={"ID":"99f215f0-5679-47ef-9cb6-fee5d63d63ac","Type":"ContainerStarted","Data":"8da17c6e1826bd0a104a612c3cc34a4158a84ee46835fdd3c7c564d20b94e734"} Apr 16 20:14:21.503597 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:21.503575 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fb9b69659-24nkk" event={"ID":"52e34f51-e7b1-4bdc-a095-4ff3b00d61be","Type":"ContainerStarted","Data":"f9f0120079891124363a36b43c63ef9d616397306f69573c0896cc3d59c4b612"} Apr 16 20:14:21.504819 
ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:21.504795 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5c99487749-lnt44" event={"ID":"c0fd6853-ffb5-4eab-9ba0-39451aeb636a","Type":"ContainerStarted","Data":"5d572a08b3c56cdf1e68f42aebdcb1197fb1b1ff9570a55ce8c482397e8e28d5"} Apr 16 20:14:21.504900 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:21.504825 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5c99487749-lnt44" event={"ID":"c0fd6853-ffb5-4eab-9ba0-39451aeb636a","Type":"ContainerStarted","Data":"af2643ead5fe8d31e4b0919ef471333a98f55cb0a6b88318a339d96509ccd2bd"} Apr 16 20:14:21.504968 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:21.504921 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5c99487749-lnt44" Apr 16 20:14:21.560106 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:21.560057 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5c99487749-lnt44" podStartSLOduration=149.560045284 podStartE2EDuration="2m29.560045284s" podCreationTimestamp="2026-04-16 20:11:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:14:21.559233272 +0000 UTC m=+162.093588999" watchObservedRunningTime="2026-04-16 20:14:21.560045284 +0000 UTC m=+162.094401009" Apr 16 20:14:22.509783 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:22.509746 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-s265g" event={"ID":"99f215f0-5679-47ef-9cb6-fee5d63d63ac","Type":"ContainerStarted","Data":"7970ec7af35cdcf04e4a3c3d31f7f8c1245cfefe34f2b33f6c2729045c7df53f"} Apr 16 20:14:22.510180 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:22.509994 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="openshift-dns/dns-default-s265g" Apr 16 20:14:22.510180 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:22.510022 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-s265g" event={"ID":"99f215f0-5679-47ef-9cb6-fee5d63d63ac","Type":"ContainerStarted","Data":"f0567529eb28cf338c21e4dba250741b1aa1cdf421eb570e2f17b0707343a577"} Apr 16 20:14:22.530887 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:22.529089 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-s265g" podStartSLOduration=129.235635524 podStartE2EDuration="2m10.529065781s" podCreationTimestamp="2026-04-16 20:12:12 +0000 UTC" firstStartedPulling="2026-04-16 20:14:20.812124895 +0000 UTC m=+161.346480599" lastFinishedPulling="2026-04-16 20:14:22.105555147 +0000 UTC m=+162.639910856" observedRunningTime="2026-04-16 20:14:22.527394918 +0000 UTC m=+163.061750657" watchObservedRunningTime="2026-04-16 20:14:22.529065781 +0000 UTC m=+163.063421509" Apr 16 20:14:28.055030 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:28.054998 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q46p8" Apr 16 20:14:28.055446 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:28.055000 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-t66hl" Apr 16 20:14:28.057628 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:28.057608 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-28d82\"" Apr 16 20:14:28.065651 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:28.065626 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-t66hl"
Apr 16 20:14:28.175348 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:28.175319 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-t66hl"]
Apr 16 20:14:28.178692 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:14:28.178663 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae1e5364_bc6b_4206_89ef_2aeef8ddd13c.slice/crio-53ccb50a56c3a3309cbf8836714f2cf2d61f3891956fced54bfef640103d084d WatchSource:0}: Error finding container 53ccb50a56c3a3309cbf8836714f2cf2d61f3891956fced54bfef640103d084d: Status 404 returned error can't find the container with id 53ccb50a56c3a3309cbf8836714f2cf2d61f3891956fced54bfef640103d084d
Apr 16 20:14:28.524787 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:28.524746 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-t66hl" event={"ID":"ae1e5364-bc6b-4206-89ef-2aeef8ddd13c","Type":"ContainerStarted","Data":"53ccb50a56c3a3309cbf8836714f2cf2d61f3891956fced54bfef640103d084d"}
Apr 16 20:14:29.605544 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:29.605514 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-2k2bj"]
Apr 16 20:14:29.608758 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:29.608733 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2k2bj"
Apr 16 20:14:29.611921 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:29.611905 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 20:14:29.612993 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:29.612966 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 20:14:29.613057 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:29.612995 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 20:14:29.613057 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:29.613043 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 20:14:29.613569 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:29.613555 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-gwklr\""
Apr 16 20:14:29.633373 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:29.631432 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2k2bj"]
Apr 16 20:14:29.757377 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:29.757339 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4bb4\" (UniqueName: \"kubernetes.io/projected/cb305b65-f801-4d89-b18c-c2f5be7436c5-kube-api-access-t4bb4\") pod \"insights-runtime-extractor-2k2bj\" (UID: \"cb305b65-f801-4d89-b18c-c2f5be7436c5\") " pod="openshift-insights/insights-runtime-extractor-2k2bj"
Apr 16 20:14:29.757547 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:29.757421 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/cb305b65-f801-4d89-b18c-c2f5be7436c5-data-volume\") pod \"insights-runtime-extractor-2k2bj\" (UID: \"cb305b65-f801-4d89-b18c-c2f5be7436c5\") " pod="openshift-insights/insights-runtime-extractor-2k2bj"
Apr 16 20:14:29.757547 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:29.757441 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/cb305b65-f801-4d89-b18c-c2f5be7436c5-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2k2bj\" (UID: \"cb305b65-f801-4d89-b18c-c2f5be7436c5\") " pod="openshift-insights/insights-runtime-extractor-2k2bj"
Apr 16 20:14:29.757547 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:29.757463 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/cb305b65-f801-4d89-b18c-c2f5be7436c5-crio-socket\") pod \"insights-runtime-extractor-2k2bj\" (UID: \"cb305b65-f801-4d89-b18c-c2f5be7436c5\") " pod="openshift-insights/insights-runtime-extractor-2k2bj"
Apr 16 20:14:29.757547 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:29.757480 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/cb305b65-f801-4d89-b18c-c2f5be7436c5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2k2bj\" (UID: \"cb305b65-f801-4d89-b18c-c2f5be7436c5\") " pod="openshift-insights/insights-runtime-extractor-2k2bj"
Apr 16 20:14:29.858613 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:29.858520 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/cb305b65-f801-4d89-b18c-c2f5be7436c5-data-volume\") pod \"insights-runtime-extractor-2k2bj\" (UID: \"cb305b65-f801-4d89-b18c-c2f5be7436c5\") " pod="openshift-insights/insights-runtime-extractor-2k2bj"
Apr 16 20:14:29.858613 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:29.858555 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/cb305b65-f801-4d89-b18c-c2f5be7436c5-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2k2bj\" (UID: \"cb305b65-f801-4d89-b18c-c2f5be7436c5\") " pod="openshift-insights/insights-runtime-extractor-2k2bj"
Apr 16 20:14:29.858613 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:29.858575 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/cb305b65-f801-4d89-b18c-c2f5be7436c5-crio-socket\") pod \"insights-runtime-extractor-2k2bj\" (UID: \"cb305b65-f801-4d89-b18c-c2f5be7436c5\") " pod="openshift-insights/insights-runtime-extractor-2k2bj"
Apr 16 20:14:29.858613 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:29.858596 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/cb305b65-f801-4d89-b18c-c2f5be7436c5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2k2bj\" (UID: \"cb305b65-f801-4d89-b18c-c2f5be7436c5\") " pod="openshift-insights/insights-runtime-extractor-2k2bj"
Apr 16 20:14:29.858888 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:29.858640 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t4bb4\" (UniqueName: \"kubernetes.io/projected/cb305b65-f801-4d89-b18c-c2f5be7436c5-kube-api-access-t4bb4\") pod \"insights-runtime-extractor-2k2bj\" (UID: \"cb305b65-f801-4d89-b18c-c2f5be7436c5\") " pod="openshift-insights/insights-runtime-extractor-2k2bj"
Apr 16 20:14:29.858888 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:29.858700 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/cb305b65-f801-4d89-b18c-c2f5be7436c5-crio-socket\") pod \"insights-runtime-extractor-2k2bj\" (UID: \"cb305b65-f801-4d89-b18c-c2f5be7436c5\") " pod="openshift-insights/insights-runtime-extractor-2k2bj"
Apr 16 20:14:29.858950 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:29.858913 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/cb305b65-f801-4d89-b18c-c2f5be7436c5-data-volume\") pod \"insights-runtime-extractor-2k2bj\" (UID: \"cb305b65-f801-4d89-b18c-c2f5be7436c5\") " pod="openshift-insights/insights-runtime-extractor-2k2bj"
Apr 16 20:14:29.859188 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:29.859162 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/cb305b65-f801-4d89-b18c-c2f5be7436c5-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2k2bj\" (UID: \"cb305b65-f801-4d89-b18c-c2f5be7436c5\") " pod="openshift-insights/insights-runtime-extractor-2k2bj"
Apr 16 20:14:29.860950 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:29.860929 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/cb305b65-f801-4d89-b18c-c2f5be7436c5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2k2bj\" (UID: \"cb305b65-f801-4d89-b18c-c2f5be7436c5\") " pod="openshift-insights/insights-runtime-extractor-2k2bj"
Apr 16 20:14:29.867374 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:29.867355 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4bb4\" (UniqueName: \"kubernetes.io/projected/cb305b65-f801-4d89-b18c-c2f5be7436c5-kube-api-access-t4bb4\") pod \"insights-runtime-extractor-2k2bj\" (UID: \"cb305b65-f801-4d89-b18c-c2f5be7436c5\") " pod="openshift-insights/insights-runtime-extractor-2k2bj"
Apr 16 20:14:29.924152 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:29.924110 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2k2bj"
Apr 16 20:14:30.044180 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:30.044148 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2k2bj"]
Apr 16 20:14:30.047291 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:14:30.047260 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb305b65_f801_4d89_b18c_c2f5be7436c5.slice/crio-ac390fd6bfc97b7f62cd1548f75a13d38f2f0fced9c7f00c9bed55629fa7276f WatchSource:0}: Error finding container ac390fd6bfc97b7f62cd1548f75a13d38f2f0fced9c7f00c9bed55629fa7276f: Status 404 returned error can't find the container with id ac390fd6bfc97b7f62cd1548f75a13d38f2f0fced9c7f00c9bed55629fa7276f
Apr 16 20:14:30.533965 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:30.533926 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-t66hl" event={"ID":"ae1e5364-bc6b-4206-89ef-2aeef8ddd13c","Type":"ContainerStarted","Data":"5105c40e7275b85fb4be76605fd5506a6d02a36b9755348c230cb8b2a57a6f54"}
Apr 16 20:14:30.535178 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:30.535156 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2k2bj" event={"ID":"cb305b65-f801-4d89-b18c-c2f5be7436c5","Type":"ContainerStarted","Data":"ee8e66496d74a5b33fc50d7d90752b46898053d4d1af2e5d748cb51b9befdcbf"}
Apr 16 20:14:30.535285 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:30.535185 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2k2bj" event={"ID":"cb305b65-f801-4d89-b18c-c2f5be7436c5","Type":"ContainerStarted","Data":"ac390fd6bfc97b7f62cd1548f75a13d38f2f0fced9c7f00c9bed55629fa7276f"}
Apr 16 20:14:30.550877 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:30.550835 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-t66hl" podStartSLOduration=137.150486024 podStartE2EDuration="2m18.550817993s" podCreationTimestamp="2026-04-16 20:12:12 +0000 UTC" firstStartedPulling="2026-04-16 20:14:28.180427993 +0000 UTC m=+168.714783705" lastFinishedPulling="2026-04-16 20:14:29.580759954 +0000 UTC m=+170.115115674" observedRunningTime="2026-04-16 20:14:30.550110033 +0000 UTC m=+171.084465761" watchObservedRunningTime="2026-04-16 20:14:30.550817993 +0000 UTC m=+171.085173718"
Apr 16 20:14:31.540362 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:31.540328 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2k2bj" event={"ID":"cb305b65-f801-4d89-b18c-c2f5be7436c5","Type":"ContainerStarted","Data":"143343697eb96a904533ea19df7c79e9beb8f7e294422b581a65a66bcf33c2e2"}
Apr 16 20:14:32.544107 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:32.544078 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2k2bj" event={"ID":"cb305b65-f801-4d89-b18c-c2f5be7436c5","Type":"ContainerStarted","Data":"c80798aeff33138a029aa4d4de1383a578df777fc1b4b78fe21b2ed86f59dce4"}
Apr 16 20:14:32.561870 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:32.561833 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-2k2bj" podStartSLOduration=1.633347702 podStartE2EDuration="3.561819763s" podCreationTimestamp="2026-04-16 20:14:29 +0000 UTC" firstStartedPulling="2026-04-16 20:14:30.104032043 +0000 UTC m=+170.638387748" lastFinishedPulling="2026-04-16 20:14:32.03250409 +0000 UTC m=+172.566859809" observedRunningTime="2026-04-16 20:14:32.560774374 +0000 UTC m=+173.095130093" watchObservedRunningTime="2026-04-16 20:14:32.561819763 +0000 UTC m=+173.096175488"
Apr 16 20:14:33.516079 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:33.516046 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-s265g"
Apr 16 20:14:40.596091 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:40.596057 2576 patch_prober.go:28] interesting pod/image-registry-5c99487749-lnt44 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 20:14:40.596502 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:40.596115 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-5c99487749-lnt44" podUID="c0fd6853-ffb5-4eab-9ba0-39451aeb636a" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:14:42.513967 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:42.513936 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5c99487749-lnt44"
Apr 16 20:14:43.018411 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.018380 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-zgmbr"]
Apr 16 20:14:43.022554 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.022538 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-zgmbr"
Apr 16 20:14:43.025819 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.025788 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 20:14:43.025819 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.025788 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-xns2x\""
Apr 16 20:14:43.026343 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.026326 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 20:14:43.026418 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.026401 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 20:14:43.026637 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.026603 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 20:14:43.026637 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.026623 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 20:14:43.026637 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.026609 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 20:14:43.044408 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.044376 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5aa991c5-99cc-4458-a0d4-0f785bb5b8cd-node-exporter-tls\") pod \"node-exporter-zgmbr\" (UID: \"5aa991c5-99cc-4458-a0d4-0f785bb5b8cd\") " pod="openshift-monitoring/node-exporter-zgmbr"
Apr 16 20:14:43.044408 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.044413 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5aa991c5-99cc-4458-a0d4-0f785bb5b8cd-root\") pod \"node-exporter-zgmbr\" (UID: \"5aa991c5-99cc-4458-a0d4-0f785bb5b8cd\") " pod="openshift-monitoring/node-exporter-zgmbr"
Apr 16 20:14:43.044537 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.044437 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5aa991c5-99cc-4458-a0d4-0f785bb5b8cd-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zgmbr\" (UID: \"5aa991c5-99cc-4458-a0d4-0f785bb5b8cd\") " pod="openshift-monitoring/node-exporter-zgmbr"
Apr 16 20:14:43.044537 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.044459 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5aa991c5-99cc-4458-a0d4-0f785bb5b8cd-node-exporter-accelerators-collector-config\") pod \"node-exporter-zgmbr\" (UID: \"5aa991c5-99cc-4458-a0d4-0f785bb5b8cd\") " pod="openshift-monitoring/node-exporter-zgmbr"
Apr 16 20:14:43.044537 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.044505 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5aa991c5-99cc-4458-a0d4-0f785bb5b8cd-node-exporter-wtmp\") pod \"node-exporter-zgmbr\" (UID: \"5aa991c5-99cc-4458-a0d4-0f785bb5b8cd\") " pod="openshift-monitoring/node-exporter-zgmbr"
Apr 16 20:14:43.044537 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.044521 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7266m\" (UniqueName: \"kubernetes.io/projected/5aa991c5-99cc-4458-a0d4-0f785bb5b8cd-kube-api-access-7266m\") pod \"node-exporter-zgmbr\" (UID: \"5aa991c5-99cc-4458-a0d4-0f785bb5b8cd\") " pod="openshift-monitoring/node-exporter-zgmbr"
Apr 16 20:14:43.044702 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.044543 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5aa991c5-99cc-4458-a0d4-0f785bb5b8cd-sys\") pod \"node-exporter-zgmbr\" (UID: \"5aa991c5-99cc-4458-a0d4-0f785bb5b8cd\") " pod="openshift-monitoring/node-exporter-zgmbr"
Apr 16 20:14:43.044702 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.044559 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5aa991c5-99cc-4458-a0d4-0f785bb5b8cd-node-exporter-textfile\") pod \"node-exporter-zgmbr\" (UID: \"5aa991c5-99cc-4458-a0d4-0f785bb5b8cd\") " pod="openshift-monitoring/node-exporter-zgmbr"
Apr 16 20:14:43.044702 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.044601 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5aa991c5-99cc-4458-a0d4-0f785bb5b8cd-metrics-client-ca\") pod \"node-exporter-zgmbr\" (UID: \"5aa991c5-99cc-4458-a0d4-0f785bb5b8cd\") " pod="openshift-monitoring/node-exporter-zgmbr"
Apr 16 20:14:43.145632 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.145605 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5aa991c5-99cc-4458-a0d4-0f785bb5b8cd-node-exporter-wtmp\") pod \"node-exporter-zgmbr\" (UID: \"5aa991c5-99cc-4458-a0d4-0f785bb5b8cd\") " pod="openshift-monitoring/node-exporter-zgmbr"
Apr 16 20:14:43.145632 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.145638 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7266m\" (UniqueName: \"kubernetes.io/projected/5aa991c5-99cc-4458-a0d4-0f785bb5b8cd-kube-api-access-7266m\") pod \"node-exporter-zgmbr\" (UID: \"5aa991c5-99cc-4458-a0d4-0f785bb5b8cd\") " pod="openshift-monitoring/node-exporter-zgmbr"
Apr 16 20:14:43.145802 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.145654 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5aa991c5-99cc-4458-a0d4-0f785bb5b8cd-sys\") pod \"node-exporter-zgmbr\" (UID: \"5aa991c5-99cc-4458-a0d4-0f785bb5b8cd\") " pod="openshift-monitoring/node-exporter-zgmbr"
Apr 16 20:14:43.145802 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.145674 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5aa991c5-99cc-4458-a0d4-0f785bb5b8cd-node-exporter-textfile\") pod \"node-exporter-zgmbr\" (UID: \"5aa991c5-99cc-4458-a0d4-0f785bb5b8cd\") " pod="openshift-monitoring/node-exporter-zgmbr"
Apr 16 20:14:43.145802 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.145691 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5aa991c5-99cc-4458-a0d4-0f785bb5b8cd-metrics-client-ca\") pod \"node-exporter-zgmbr\" (UID: \"5aa991c5-99cc-4458-a0d4-0f785bb5b8cd\") " pod="openshift-monitoring/node-exporter-zgmbr"
Apr 16 20:14:43.145802 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.145738 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5aa991c5-99cc-4458-a0d4-0f785bb5b8cd-sys\") pod \"node-exporter-zgmbr\" (UID: \"5aa991c5-99cc-4458-a0d4-0f785bb5b8cd\") " pod="openshift-monitoring/node-exporter-zgmbr"
Apr 16 20:14:43.145802 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.145765 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5aa991c5-99cc-4458-a0d4-0f785bb5b8cd-node-exporter-wtmp\") pod \"node-exporter-zgmbr\" (UID: \"5aa991c5-99cc-4458-a0d4-0f785bb5b8cd\") " pod="openshift-monitoring/node-exporter-zgmbr"
Apr 16 20:14:43.145802 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.145742 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5aa991c5-99cc-4458-a0d4-0f785bb5b8cd-node-exporter-tls\") pod \"node-exporter-zgmbr\" (UID: \"5aa991c5-99cc-4458-a0d4-0f785bb5b8cd\") " pod="openshift-monitoring/node-exporter-zgmbr"
Apr 16 20:14:43.146037 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:14:43.145817 2576 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 16 20:14:43.146037 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.145825 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5aa991c5-99cc-4458-a0d4-0f785bb5b8cd-root\") pod \"node-exporter-zgmbr\" (UID: \"5aa991c5-99cc-4458-a0d4-0f785bb5b8cd\") " pod="openshift-monitoring/node-exporter-zgmbr"
Apr 16 20:14:43.146037 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:14:43.145870 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5aa991c5-99cc-4458-a0d4-0f785bb5b8cd-node-exporter-tls podName:5aa991c5-99cc-4458-a0d4-0f785bb5b8cd nodeName:}" failed. No retries permitted until 2026-04-16 20:14:43.645852153 +0000 UTC m=+184.180207860 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/5aa991c5-99cc-4458-a0d4-0f785bb5b8cd-node-exporter-tls") pod "node-exporter-zgmbr" (UID: "5aa991c5-99cc-4458-a0d4-0f785bb5b8cd") : secret "node-exporter-tls" not found
Apr 16 20:14:43.146037 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.145891 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5aa991c5-99cc-4458-a0d4-0f785bb5b8cd-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zgmbr\" (UID: \"5aa991c5-99cc-4458-a0d4-0f785bb5b8cd\") " pod="openshift-monitoring/node-exporter-zgmbr"
Apr 16 20:14:43.146037 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.145870 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5aa991c5-99cc-4458-a0d4-0f785bb5b8cd-root\") pod \"node-exporter-zgmbr\" (UID: \"5aa991c5-99cc-4458-a0d4-0f785bb5b8cd\") " pod="openshift-monitoring/node-exporter-zgmbr"
Apr 16 20:14:43.146037 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.145921 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5aa991c5-99cc-4458-a0d4-0f785bb5b8cd-node-exporter-accelerators-collector-config\") pod \"node-exporter-zgmbr\" (UID: \"5aa991c5-99cc-4458-a0d4-0f785bb5b8cd\") " pod="openshift-monitoring/node-exporter-zgmbr"
Apr 16 20:14:43.146318 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.146058 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5aa991c5-99cc-4458-a0d4-0f785bb5b8cd-node-exporter-textfile\") pod \"node-exporter-zgmbr\" (UID: \"5aa991c5-99cc-4458-a0d4-0f785bb5b8cd\") " pod="openshift-monitoring/node-exporter-zgmbr"
Apr 16 20:14:43.146318 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.146265 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5aa991c5-99cc-4458-a0d4-0f785bb5b8cd-metrics-client-ca\") pod \"node-exporter-zgmbr\" (UID: \"5aa991c5-99cc-4458-a0d4-0f785bb5b8cd\") " pod="openshift-monitoring/node-exporter-zgmbr"
Apr 16 20:14:43.146392 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.146316 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5aa991c5-99cc-4458-a0d4-0f785bb5b8cd-node-exporter-accelerators-collector-config\") pod \"node-exporter-zgmbr\" (UID: \"5aa991c5-99cc-4458-a0d4-0f785bb5b8cd\") " pod="openshift-monitoring/node-exporter-zgmbr"
Apr 16 20:14:43.148035 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.148014 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5aa991c5-99cc-4458-a0d4-0f785bb5b8cd-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zgmbr\" (UID: \"5aa991c5-99cc-4458-a0d4-0f785bb5b8cd\") " pod="openshift-monitoring/node-exporter-zgmbr"
Apr 16 20:14:43.164961 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.164940 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7266m\" (UniqueName: \"kubernetes.io/projected/5aa991c5-99cc-4458-a0d4-0f785bb5b8cd-kube-api-access-7266m\") pod \"node-exporter-zgmbr\" (UID: \"5aa991c5-99cc-4458-a0d4-0f785bb5b8cd\") " pod="openshift-monitoring/node-exporter-zgmbr"
Apr 16 20:14:43.650141 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.650116 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5aa991c5-99cc-4458-a0d4-0f785bb5b8cd-node-exporter-tls\") pod \"node-exporter-zgmbr\" (UID: \"5aa991c5-99cc-4458-a0d4-0f785bb5b8cd\") " pod="openshift-monitoring/node-exporter-zgmbr"
Apr 16 20:14:43.652264 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.652248 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5aa991c5-99cc-4458-a0d4-0f785bb5b8cd-node-exporter-tls\") pod \"node-exporter-zgmbr\" (UID: \"5aa991c5-99cc-4458-a0d4-0f785bb5b8cd\") " pod="openshift-monitoring/node-exporter-zgmbr"
Apr 16 20:14:43.931357 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:43.931290 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-zgmbr"
Apr 16 20:14:43.939618 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:14:43.939593 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5aa991c5_99cc_4458_a0d4_0f785bb5b8cd.slice/crio-95d3058786d4ac24208f05fb3abced502e091e80ff84d4dc0431daac9a40a850 WatchSource:0}: Error finding container 95d3058786d4ac24208f05fb3abced502e091e80ff84d4dc0431daac9a40a850: Status 404 returned error can't find the container with id 95d3058786d4ac24208f05fb3abced502e091e80ff84d4dc0431daac9a40a850
Apr 16 20:14:44.574522 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:44.574488 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zgmbr" event={"ID":"5aa991c5-99cc-4458-a0d4-0f785bb5b8cd","Type":"ContainerStarted","Data":"95d3058786d4ac24208f05fb3abced502e091e80ff84d4dc0431daac9a40a850"}
Apr 16 20:14:45.578210 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:45.578177 2576 generic.go:358] "Generic (PLEG): container finished" podID="5aa991c5-99cc-4458-a0d4-0f785bb5b8cd" containerID="b710829a1c21e8811d081a354e3b56e353da28e0d869c8700b8fc04f47d527aa" exitCode=0
Apr 16 20:14:45.578655 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:45.578255 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zgmbr" event={"ID":"5aa991c5-99cc-4458-a0d4-0f785bb5b8cd","Type":"ContainerDied","Data":"b710829a1c21e8811d081a354e3b56e353da28e0d869c8700b8fc04f47d527aa"}
Apr 16 20:14:46.582387 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:46.582356 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zgmbr" event={"ID":"5aa991c5-99cc-4458-a0d4-0f785bb5b8cd","Type":"ContainerStarted","Data":"d12378e745e5dd54c02f30a990b9aac3236321153b4ea737b2376e25261a5b85"}
Apr 16 20:14:46.582387 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:46.582388 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zgmbr" event={"ID":"5aa991c5-99cc-4458-a0d4-0f785bb5b8cd","Type":"ContainerStarted","Data":"7e4a2f102b3fcfb34a001523117cc0d3545eacb36fd752972d5b5b0fc1bd0cf5"}
Apr 16 20:14:46.604279 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:46.604235 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-zgmbr" podStartSLOduration=3.859947837 podStartE2EDuration="4.604222396s" podCreationTimestamp="2026-04-16 20:14:42 +0000 UTC" firstStartedPulling="2026-04-16 20:14:43.941270409 +0000 UTC m=+184.475626114" lastFinishedPulling="2026-04-16 20:14:44.685544958 +0000 UTC m=+185.219900673" observedRunningTime="2026-04-16 20:14:46.603318384 +0000 UTC m=+187.137674114" watchObservedRunningTime="2026-04-16 20:14:46.604222396 +0000 UTC m=+187.138578121"
Apr 16 20:14:51.742009 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:14:51.741964 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5c99487749-lnt44"]
Apr 16 20:15:02.777862 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:02.777822 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59965559f7-nqpnv" podUID="900b92c0-dfa0-40fc-8682-a12bd5f43f14" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 20:15:12.776993 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:12.776940 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59965559f7-nqpnv" podUID="900b92c0-dfa0-40fc-8682-a12bd5f43f14" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 20:15:16.762953 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:16.762905 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5c99487749-lnt44" podUID="c0fd6853-ffb5-4eab-9ba0-39451aeb636a" containerName="registry" containerID="cri-o://5d572a08b3c56cdf1e68f42aebdcb1197fb1b1ff9570a55ce8c482397e8e28d5" gracePeriod=30
Apr 16 20:15:16.992838 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:16.992820 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5c99487749-lnt44"
Apr 16 20:15:17.080582 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:17.080526 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-registry-tls\") pod \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") "
Apr 16 20:15:17.080582 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:17.080576 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-installation-pull-secrets\") pod \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") "
Apr 16 20:15:17.080720 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:17.080688 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-bound-sa-token\") pod \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") "
Apr 16 20:15:17.080771 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:17.080719 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-registry-certificates\") pod \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") "
Apr 16 20:15:17.080771 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:17.080743 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-image-registry-private-configuration\") pod \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") "
Apr 16 20:15:17.080771 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:17.080763 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9k49\" (UniqueName: \"kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-kube-api-access-v9k49\") pod \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") "
Apr 16 20:15:17.080909 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:17.080822 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-trusted-ca\") pod \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") "
Apr 16 20:15:17.080909 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:17.080857 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-ca-trust-extracted\") pod \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\" (UID: \"c0fd6853-ffb5-4eab-9ba0-39451aeb636a\") "
Apr 16 20:15:17.081287 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:17.081257 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "c0fd6853-ffb5-4eab-9ba0-39451aeb636a" (UID: "c0fd6853-ffb5-4eab-9ba0-39451aeb636a"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:15:17.081385 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:17.081338 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "c0fd6853-ffb5-4eab-9ba0-39451aeb636a" (UID: "c0fd6853-ffb5-4eab-9ba0-39451aeb636a"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:15:17.083373 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:17.083342 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "c0fd6853-ffb5-4eab-9ba0-39451aeb636a" (UID: "c0fd6853-ffb5-4eab-9ba0-39451aeb636a"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:15:17.083474 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:17.083391 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "c0fd6853-ffb5-4eab-9ba0-39451aeb636a" (UID: "c0fd6853-ffb5-4eab-9ba0-39451aeb636a"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:15:17.083474 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:17.083392 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-kube-api-access-v9k49" (OuterVolumeSpecName: "kube-api-access-v9k49") pod "c0fd6853-ffb5-4eab-9ba0-39451aeb636a" (UID: "c0fd6853-ffb5-4eab-9ba0-39451aeb636a"). InnerVolumeSpecName "kube-api-access-v9k49". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:15:17.083474 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:17.083402 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "c0fd6853-ffb5-4eab-9ba0-39451aeb636a" (UID: "c0fd6853-ffb5-4eab-9ba0-39451aeb636a"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:15:17.083590 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:17.083484 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "c0fd6853-ffb5-4eab-9ba0-39451aeb636a" (UID: "c0fd6853-ffb5-4eab-9ba0-39451aeb636a"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:15:17.091242 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:17.091218 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "c0fd6853-ffb5-4eab-9ba0-39451aeb636a" (UID: "c0fd6853-ffb5-4eab-9ba0-39451aeb636a"). InnerVolumeSpecName "ca-trust-extracted".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:15:17.181899 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:17.181866 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-trusted-ca\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:15:17.181899 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:17.181899 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-ca-trust-extracted\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:15:17.182063 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:17.181915 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-registry-tls\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:15:17.182063 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:17.181928 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-installation-pull-secrets\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:15:17.182063 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:17.181940 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-bound-sa-token\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:15:17.182063 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:17.181952 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-registry-certificates\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:15:17.182063 ip-10-0-141-145 
kubenswrapper[2576]: I0416 20:15:17.181965 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-image-registry-private-configuration\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:15:17.182063 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:17.181999 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v9k49\" (UniqueName: \"kubernetes.io/projected/c0fd6853-ffb5-4eab-9ba0-39451aeb636a-kube-api-access-v9k49\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:15:17.654210 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:17.654172 2576 generic.go:358] "Generic (PLEG): container finished" podID="c0fd6853-ffb5-4eab-9ba0-39451aeb636a" containerID="5d572a08b3c56cdf1e68f42aebdcb1197fb1b1ff9570a55ce8c482397e8e28d5" exitCode=0 Apr 16 20:15:17.654366 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:17.654233 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5c99487749-lnt44" Apr 16 20:15:17.654366 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:17.654259 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5c99487749-lnt44" event={"ID":"c0fd6853-ffb5-4eab-9ba0-39451aeb636a","Type":"ContainerDied","Data":"5d572a08b3c56cdf1e68f42aebdcb1197fb1b1ff9570a55ce8c482397e8e28d5"} Apr 16 20:15:17.654366 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:17.654293 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5c99487749-lnt44" event={"ID":"c0fd6853-ffb5-4eab-9ba0-39451aeb636a","Type":"ContainerDied","Data":"af2643ead5fe8d31e4b0919ef471333a98f55cb0a6b88318a339d96509ccd2bd"} Apr 16 20:15:17.654366 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:17.654307 2576 scope.go:117] "RemoveContainer" containerID="5d572a08b3c56cdf1e68f42aebdcb1197fb1b1ff9570a55ce8c482397e8e28d5" Apr 16 20:15:17.662128 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:17.662042 2576 scope.go:117] "RemoveContainer" containerID="5d572a08b3c56cdf1e68f42aebdcb1197fb1b1ff9570a55ce8c482397e8e28d5" Apr 16 20:15:17.662352 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:15:17.662322 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d572a08b3c56cdf1e68f42aebdcb1197fb1b1ff9570a55ce8c482397e8e28d5\": container with ID starting with 5d572a08b3c56cdf1e68f42aebdcb1197fb1b1ff9570a55ce8c482397e8e28d5 not found: ID does not exist" containerID="5d572a08b3c56cdf1e68f42aebdcb1197fb1b1ff9570a55ce8c482397e8e28d5" Apr 16 20:15:17.662464 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:17.662365 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d572a08b3c56cdf1e68f42aebdcb1197fb1b1ff9570a55ce8c482397e8e28d5"} err="failed to get container status 
\"5d572a08b3c56cdf1e68f42aebdcb1197fb1b1ff9570a55ce8c482397e8e28d5\": rpc error: code = NotFound desc = could not find container \"5d572a08b3c56cdf1e68f42aebdcb1197fb1b1ff9570a55ce8c482397e8e28d5\": container with ID starting with 5d572a08b3c56cdf1e68f42aebdcb1197fb1b1ff9570a55ce8c482397e8e28d5 not found: ID does not exist" Apr 16 20:15:17.677162 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:17.677138 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5c99487749-lnt44"] Apr 16 20:15:17.684697 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:17.684672 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5c99487749-lnt44"] Apr 16 20:15:18.058554 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:18.058524 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0fd6853-ffb5-4eab-9ba0-39451aeb636a" path="/var/lib/kubelet/pods/c0fd6853-ffb5-4eab-9ba0-39451aeb636a/volumes" Apr 16 20:15:19.184704 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:19.184679 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zgmbr_5aa991c5-99cc-4458-a0d4-0f785bb5b8cd/init-textfile/0.log" Apr 16 20:15:19.382864 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:19.382842 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zgmbr_5aa991c5-99cc-4458-a0d4-0f785bb5b8cd/node-exporter/0.log" Apr 16 20:15:19.581698 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:19.581675 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zgmbr_5aa991c5-99cc-4458-a0d4-0f785bb5b8cd/kube-rbac-proxy/0.log" Apr 16 20:15:22.777061 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:22.777023 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59965559f7-nqpnv" 
podUID="900b92c0-dfa0-40fc-8682-a12bd5f43f14" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 20:15:22.777427 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:22.777087 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59965559f7-nqpnv" Apr 16 20:15:22.777562 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:22.777544 2576 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"180b905494b86e2dd29ae19c328c2fddf6bd7c5f008fb6b6f1d1e128cabe8214"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59965559f7-nqpnv" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 16 20:15:22.777597 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:22.777581 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59965559f7-nqpnv" podUID="900b92c0-dfa0-40fc-8682-a12bd5f43f14" containerName="service-proxy" containerID="cri-o://180b905494b86e2dd29ae19c328c2fddf6bd7c5f008fb6b6f1d1e128cabe8214" gracePeriod=30 Apr 16 20:15:23.671390 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:23.671355 2576 generic.go:358] "Generic (PLEG): container finished" podID="900b92c0-dfa0-40fc-8682-a12bd5f43f14" containerID="180b905494b86e2dd29ae19c328c2fddf6bd7c5f008fb6b6f1d1e128cabe8214" exitCode=2 Apr 16 20:15:23.671578 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:23.671425 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59965559f7-nqpnv" event={"ID":"900b92c0-dfa0-40fc-8682-a12bd5f43f14","Type":"ContainerDied","Data":"180b905494b86e2dd29ae19c328c2fddf6bd7c5f008fb6b6f1d1e128cabe8214"} Apr 16 20:15:23.671578 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:23.671467 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59965559f7-nqpnv" event={"ID":"900b92c0-dfa0-40fc-8682-a12bd5f43f14","Type":"ContainerStarted","Data":"54332437e3578e09f3390bce8e3425af2634665fc8dc7cd0298c00c1e995b32f"} Apr 16 20:15:51.820440 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:51.820347 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6bfc3ff-419c-47de-880a-e6eeafc7247a-metrics-certs\") pod \"network-metrics-daemon-q46p8\" (UID: \"d6bfc3ff-419c-47de-880a-e6eeafc7247a\") " pod="openshift-multus/network-metrics-daemon-q46p8" Apr 16 20:15:51.822662 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:51.822635 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6bfc3ff-419c-47de-880a-e6eeafc7247a-metrics-certs\") pod \"network-metrics-daemon-q46p8\" (UID: \"d6bfc3ff-419c-47de-880a-e6eeafc7247a\") " pod="openshift-multus/network-metrics-daemon-q46p8" Apr 16 20:15:52.057960 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:52.057929 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-8662n\"" Apr 16 20:15:52.066244 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:52.066224 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q46p8" Apr 16 20:15:52.193403 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:52.193376 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-q46p8"] Apr 16 20:15:52.196327 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:15:52.196300 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6bfc3ff_419c_47de_880a_e6eeafc7247a.slice/crio-35a95cc40d26a6755f41a6e597c55c3527d88623a0a2df60cef6e84f3b9ce510 WatchSource:0}: Error finding container 35a95cc40d26a6755f41a6e597c55c3527d88623a0a2df60cef6e84f3b9ce510: Status 404 returned error can't find the container with id 35a95cc40d26a6755f41a6e597c55c3527d88623a0a2df60cef6e84f3b9ce510 Apr 16 20:15:52.744838 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:52.744805 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q46p8" event={"ID":"d6bfc3ff-419c-47de-880a-e6eeafc7247a","Type":"ContainerStarted","Data":"35a95cc40d26a6755f41a6e597c55c3527d88623a0a2df60cef6e84f3b9ce510"} Apr 16 20:15:53.748554 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:53.748521 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q46p8" event={"ID":"d6bfc3ff-419c-47de-880a-e6eeafc7247a","Type":"ContainerStarted","Data":"5253cfae753f6c5ed244cd0a7f4bcc87da7c3aeea8edf9778ae32df83bfe01b6"} Apr 16 20:15:53.748554 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:53.748554 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q46p8" event={"ID":"d6bfc3ff-419c-47de-880a-e6eeafc7247a","Type":"ContainerStarted","Data":"66aa04b3a263404de186d4dc03e46dff0730c05036fb057976a3388506e7c422"} Apr 16 20:15:53.764965 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:15:53.764921 2576 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-q46p8" podStartSLOduration=252.643752274 podStartE2EDuration="4m13.764908608s" podCreationTimestamp="2026-04-16 20:11:40 +0000 UTC" firstStartedPulling="2026-04-16 20:15:52.198187194 +0000 UTC m=+252.732542904" lastFinishedPulling="2026-04-16 20:15:53.319343534 +0000 UTC m=+253.853699238" observedRunningTime="2026-04-16 20:15:53.763801379 +0000 UTC m=+254.298157099" watchObservedRunningTime="2026-04-16 20:15:53.764908608 +0000 UTC m=+254.299264333" Apr 16 20:16:39.930003 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:16:39.929953 2576 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 20:17:29.509156 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:29.509077 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-mrztv"] Apr 16 20:17:29.509646 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:29.509331 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0fd6853-ffb5-4eab-9ba0-39451aeb636a" containerName="registry" Apr 16 20:17:29.509646 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:29.509342 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0fd6853-ffb5-4eab-9ba0-39451aeb636a" containerName="registry" Apr 16 20:17:29.509646 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:29.509382 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c0fd6853-ffb5-4eab-9ba0-39451aeb636a" containerName="registry" Apr 16 20:17:29.511920 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:29.511904 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mrztv" Apr 16 20:17:29.514224 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:29.514200 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 20:17:29.514318 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:29.514299 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 20:17:29.514440 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:29.514419 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 20:17:29.514440 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:29.514419 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 20:17:29.515014 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:29.515000 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 16 20:17:29.515154 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:29.515139 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-r6d72\"" Apr 16 20:17:29.520082 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:29.520059 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-mrztv"] Apr 16 20:17:29.628746 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:29.628729 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/602cb535-91e9-4265-ae2b-4d223b435d42-certificates\") pod \"keda-metrics-apiserver-7c9f485588-mrztv\" (UID: \"602cb535-91e9-4265-ae2b-4d223b435d42\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mrztv" Apr 16 
20:17:29.628832 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:29.628767 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj4dr\" (UniqueName: \"kubernetes.io/projected/602cb535-91e9-4265-ae2b-4d223b435d42-kube-api-access-wj4dr\") pod \"keda-metrics-apiserver-7c9f485588-mrztv\" (UID: \"602cb535-91e9-4265-ae2b-4d223b435d42\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mrztv" Apr 16 20:17:29.628832 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:29.628789 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/602cb535-91e9-4265-ae2b-4d223b435d42-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-mrztv\" (UID: \"602cb535-91e9-4265-ae2b-4d223b435d42\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mrztv" Apr 16 20:17:29.728991 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:29.728952 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/602cb535-91e9-4265-ae2b-4d223b435d42-certificates\") pod \"keda-metrics-apiserver-7c9f485588-mrztv\" (UID: \"602cb535-91e9-4265-ae2b-4d223b435d42\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mrztv" Apr 16 20:17:29.729087 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:29.729023 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wj4dr\" (UniqueName: \"kubernetes.io/projected/602cb535-91e9-4265-ae2b-4d223b435d42-kube-api-access-wj4dr\") pod \"keda-metrics-apiserver-7c9f485588-mrztv\" (UID: \"602cb535-91e9-4265-ae2b-4d223b435d42\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mrztv" Apr 16 20:17:29.729087 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:29.729061 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: 
\"kubernetes.io/empty-dir/602cb535-91e9-4265-ae2b-4d223b435d42-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-mrztv\" (UID: \"602cb535-91e9-4265-ae2b-4d223b435d42\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mrztv" Apr 16 20:17:29.729087 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:17:29.729070 2576 secret.go:281] references non-existent secret key: tls.crt Apr 16 20:17:29.729087 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:17:29.729084 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 20:17:29.729341 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:17:29.729106 2576 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found Apr 16 20:17:29.729341 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:17:29.729124 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-mrztv: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 16 20:17:29.729341 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:17:29.729177 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/602cb535-91e9-4265-ae2b-4d223b435d42-certificates podName:602cb535-91e9-4265-ae2b-4d223b435d42 nodeName:}" failed. No retries permitted until 2026-04-16 20:17:30.229160845 +0000 UTC m=+350.763516552 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/602cb535-91e9-4265-ae2b-4d223b435d42-certificates") pod "keda-metrics-apiserver-7c9f485588-mrztv" (UID: "602cb535-91e9-4265-ae2b-4d223b435d42") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 16 20:17:29.729481 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:29.729410 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/602cb535-91e9-4265-ae2b-4d223b435d42-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-mrztv\" (UID: \"602cb535-91e9-4265-ae2b-4d223b435d42\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mrztv" Apr 16 20:17:29.737331 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:29.737307 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj4dr\" (UniqueName: \"kubernetes.io/projected/602cb535-91e9-4265-ae2b-4d223b435d42-kube-api-access-wj4dr\") pod \"keda-metrics-apiserver-7c9f485588-mrztv\" (UID: \"602cb535-91e9-4265-ae2b-4d223b435d42\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mrztv" Apr 16 20:17:29.807933 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:29.807857 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-t2k2d"] Apr 16 20:17:29.810725 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:29.810708 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-t2k2d" Apr 16 20:17:29.812813 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:29.812797 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 16 20:17:29.818723 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:29.818700 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-t2k2d"] Apr 16 20:17:29.929817 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:29.929791 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d527cf33-d5f8-41b9-ada4-e4602619f0de-certificates\") pod \"keda-admission-cf49989db-t2k2d\" (UID: \"d527cf33-d5f8-41b9-ada4-e4602619f0de\") " pod="openshift-keda/keda-admission-cf49989db-t2k2d" Apr 16 20:17:29.929944 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:29.929852 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rvfq\" (UniqueName: \"kubernetes.io/projected/d527cf33-d5f8-41b9-ada4-e4602619f0de-kube-api-access-6rvfq\") pod \"keda-admission-cf49989db-t2k2d\" (UID: \"d527cf33-d5f8-41b9-ada4-e4602619f0de\") " pod="openshift-keda/keda-admission-cf49989db-t2k2d" Apr 16 20:17:30.030199 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:30.030174 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6rvfq\" (UniqueName: \"kubernetes.io/projected/d527cf33-d5f8-41b9-ada4-e4602619f0de-kube-api-access-6rvfq\") pod \"keda-admission-cf49989db-t2k2d\" (UID: \"d527cf33-d5f8-41b9-ada4-e4602619f0de\") " pod="openshift-keda/keda-admission-cf49989db-t2k2d" Apr 16 20:17:30.030370 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:30.030215 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/d527cf33-d5f8-41b9-ada4-e4602619f0de-certificates\") pod \"keda-admission-cf49989db-t2k2d\" (UID: \"d527cf33-d5f8-41b9-ada4-e4602619f0de\") " pod="openshift-keda/keda-admission-cf49989db-t2k2d" Apr 16 20:17:30.034645 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:30.034625 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d527cf33-d5f8-41b9-ada4-e4602619f0de-certificates\") pod \"keda-admission-cf49989db-t2k2d\" (UID: \"d527cf33-d5f8-41b9-ada4-e4602619f0de\") " pod="openshift-keda/keda-admission-cf49989db-t2k2d" Apr 16 20:17:30.037920 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:30.037898 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rvfq\" (UniqueName: \"kubernetes.io/projected/d527cf33-d5f8-41b9-ada4-e4602619f0de-kube-api-access-6rvfq\") pod \"keda-admission-cf49989db-t2k2d\" (UID: \"d527cf33-d5f8-41b9-ada4-e4602619f0de\") " pod="openshift-keda/keda-admission-cf49989db-t2k2d" Apr 16 20:17:30.119700 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:30.119647 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-t2k2d" Apr 16 20:17:30.230500 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:30.230476 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-t2k2d"] Apr 16 20:17:30.232118 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:30.232099 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/602cb535-91e9-4265-ae2b-4d223b435d42-certificates\") pod \"keda-metrics-apiserver-7c9f485588-mrztv\" (UID: \"602cb535-91e9-4265-ae2b-4d223b435d42\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mrztv" Apr 16 20:17:30.232227 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:17:30.232216 2576 secret.go:281] references non-existent secret key: tls.crt Apr 16 20:17:30.232261 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:17:30.232232 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 20:17:30.232261 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:17:30.232248 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-mrztv: references non-existent secret key: tls.crt Apr 16 20:17:30.232319 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:17:30.232293 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/602cb535-91e9-4265-ae2b-4d223b435d42-certificates podName:602cb535-91e9-4265-ae2b-4d223b435d42 nodeName:}" failed. No retries permitted until 2026-04-16 20:17:31.232279108 +0000 UTC m=+351.766634812 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/602cb535-91e9-4265-ae2b-4d223b435d42-certificates") pod "keda-metrics-apiserver-7c9f485588-mrztv" (UID: "602cb535-91e9-4265-ae2b-4d223b435d42") : references non-existent secret key: tls.crt Apr 16 20:17:30.232789 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:17:30.232770 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd527cf33_d5f8_41b9_ada4_e4602619f0de.slice/crio-436d4bed4eca67405b3bba305083ac16ef47a0de35b77fa9d1c1f1fcf102b593 WatchSource:0}: Error finding container 436d4bed4eca67405b3bba305083ac16ef47a0de35b77fa9d1c1f1fcf102b593: Status 404 returned error can't find the container with id 436d4bed4eca67405b3bba305083ac16ef47a0de35b77fa9d1c1f1fcf102b593 Apr 16 20:17:30.233873 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:30.233856 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:17:30.978481 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:30.978436 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-t2k2d" event={"ID":"d527cf33-d5f8-41b9-ada4-e4602619f0de","Type":"ContainerStarted","Data":"436d4bed4eca67405b3bba305083ac16ef47a0de35b77fa9d1c1f1fcf102b593"} Apr 16 20:17:31.238705 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:31.238635 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/602cb535-91e9-4265-ae2b-4d223b435d42-certificates\") pod \"keda-metrics-apiserver-7c9f485588-mrztv\" (UID: \"602cb535-91e9-4265-ae2b-4d223b435d42\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mrztv" Apr 16 20:17:31.238832 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:17:31.238799 2576 secret.go:281] references non-existent secret key: tls.crt Apr 16 20:17:31.238832 ip-10-0-141-145 
kubenswrapper[2576]: E0416 20:17:31.238820 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 20:17:31.238918 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:17:31.238843 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-mrztv: references non-existent secret key: tls.crt Apr 16 20:17:31.238960 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:17:31.238928 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/602cb535-91e9-4265-ae2b-4d223b435d42-certificates podName:602cb535-91e9-4265-ae2b-4d223b435d42 nodeName:}" failed. No retries permitted until 2026-04-16 20:17:33.238912909 +0000 UTC m=+353.773268618 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/602cb535-91e9-4265-ae2b-4d223b435d42-certificates") pod "keda-metrics-apiserver-7c9f485588-mrztv" (UID: "602cb535-91e9-4265-ae2b-4d223b435d42") : references non-existent secret key: tls.crt Apr 16 20:17:32.984810 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:32.984777 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-t2k2d" event={"ID":"d527cf33-d5f8-41b9-ada4-e4602619f0de","Type":"ContainerStarted","Data":"2c62e2a9062841af064a5f239915bab3e39b120a18593bce539e1ff7bf7c3083"} Apr 16 20:17:32.985189 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:32.984888 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-t2k2d" Apr 16 20:17:33.006342 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:33.006304 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-t2k2d" podStartSLOduration=1.766667091 podStartE2EDuration="4.00629249s" podCreationTimestamp="2026-04-16 20:17:29 +0000 UTC" 
firstStartedPulling="2026-04-16 20:17:30.234028378 +0000 UTC m=+350.768384096" lastFinishedPulling="2026-04-16 20:17:32.473653791 +0000 UTC m=+353.008009495" observedRunningTime="2026-04-16 20:17:33.006021119 +0000 UTC m=+353.540376841" watchObservedRunningTime="2026-04-16 20:17:33.00629249 +0000 UTC m=+353.540648216" Apr 16 20:17:33.253862 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:33.253804 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/602cb535-91e9-4265-ae2b-4d223b435d42-certificates\") pod \"keda-metrics-apiserver-7c9f485588-mrztv\" (UID: \"602cb535-91e9-4265-ae2b-4d223b435d42\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mrztv" Apr 16 20:17:33.256188 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:33.256169 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/602cb535-91e9-4265-ae2b-4d223b435d42-certificates\") pod \"keda-metrics-apiserver-7c9f485588-mrztv\" (UID: \"602cb535-91e9-4265-ae2b-4d223b435d42\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mrztv" Apr 16 20:17:33.422031 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:33.422010 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mrztv" Apr 16 20:17:33.544202 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:33.544140 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-mrztv"] Apr 16 20:17:33.547009 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:17:33.546969 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod602cb535_91e9_4265_ae2b_4d223b435d42.slice/crio-f02e577ed0faf99530c41028d19cab7782c19702771d6d6749bb687035da9439 WatchSource:0}: Error finding container f02e577ed0faf99530c41028d19cab7782c19702771d6d6749bb687035da9439: Status 404 returned error can't find the container with id f02e577ed0faf99530c41028d19cab7782c19702771d6d6749bb687035da9439 Apr 16 20:17:33.989966 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:33.989935 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mrztv" event={"ID":"602cb535-91e9-4265-ae2b-4d223b435d42","Type":"ContainerStarted","Data":"f02e577ed0faf99530c41028d19cab7782c19702771d6d6749bb687035da9439"} Apr 16 20:17:37.002314 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:37.002282 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mrztv" event={"ID":"602cb535-91e9-4265-ae2b-4d223b435d42","Type":"ContainerStarted","Data":"2ca7563c4a56b927ff1e9bb1e27b6d50f92924f6df3948baa8db8d72bc79c434"} Apr 16 20:17:37.002697 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:37.002406 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mrztv" Apr 16 20:17:37.043795 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:37.043757 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mrztv" 
podStartSLOduration=5.5709676009999995 podStartE2EDuration="8.04374754s" podCreationTimestamp="2026-04-16 20:17:29 +0000 UTC" firstStartedPulling="2026-04-16 20:17:33.548296243 +0000 UTC m=+354.082651948" lastFinishedPulling="2026-04-16 20:17:36.02107617 +0000 UTC m=+356.555431887" observedRunningTime="2026-04-16 20:17:37.0435706 +0000 UTC m=+357.577926325" watchObservedRunningTime="2026-04-16 20:17:37.04374754 +0000 UTC m=+357.578103266" Apr 16 20:17:48.009650 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:48.009614 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mrztv" Apr 16 20:17:53.992242 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:17:53.992213 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-t2k2d" Apr 16 20:18:59.465558 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:18:59.465483 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-sxq4d"] Apr 16 20:18:59.468333 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:18:59.468317 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sxq4d" Apr 16 20:18:59.470719 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:18:59.470701 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:18:59.470719 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:18:59.470701 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-czp9b\"" Apr 16 20:18:59.471661 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:18:59.471645 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 20:18:59.477401 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:18:59.477382 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-sxq4d"] Apr 16 20:18:59.603990 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:18:59.603947 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/85ee8ad9-c0fa-4b55-8366-030eac073f32-tmp\") pod \"openshift-lws-operator-bfc7f696d-sxq4d\" (UID: \"85ee8ad9-c0fa-4b55-8366-030eac073f32\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sxq4d" Apr 16 20:18:59.604133 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:18:59.603996 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7752w\" (UniqueName: \"kubernetes.io/projected/85ee8ad9-c0fa-4b55-8366-030eac073f32-kube-api-access-7752w\") pod \"openshift-lws-operator-bfc7f696d-sxq4d\" (UID: \"85ee8ad9-c0fa-4b55-8366-030eac073f32\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sxq4d" Apr 16 20:18:59.704496 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:18:59.704474 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/85ee8ad9-c0fa-4b55-8366-030eac073f32-tmp\") pod \"openshift-lws-operator-bfc7f696d-sxq4d\" (UID: \"85ee8ad9-c0fa-4b55-8366-030eac073f32\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sxq4d" Apr 16 20:18:59.704588 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:18:59.704503 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7752w\" (UniqueName: \"kubernetes.io/projected/85ee8ad9-c0fa-4b55-8366-030eac073f32-kube-api-access-7752w\") pod \"openshift-lws-operator-bfc7f696d-sxq4d\" (UID: \"85ee8ad9-c0fa-4b55-8366-030eac073f32\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sxq4d" Apr 16 20:18:59.704845 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:18:59.704824 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/85ee8ad9-c0fa-4b55-8366-030eac073f32-tmp\") pod \"openshift-lws-operator-bfc7f696d-sxq4d\" (UID: \"85ee8ad9-c0fa-4b55-8366-030eac073f32\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sxq4d" Apr 16 20:18:59.713675 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:18:59.713648 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7752w\" (UniqueName: \"kubernetes.io/projected/85ee8ad9-c0fa-4b55-8366-030eac073f32-kube-api-access-7752w\") pod \"openshift-lws-operator-bfc7f696d-sxq4d\" (UID: \"85ee8ad9-c0fa-4b55-8366-030eac073f32\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sxq4d" Apr 16 20:18:59.777189 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:18:59.777134 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sxq4d" Apr 16 20:18:59.890871 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:18:59.890846 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-sxq4d"] Apr 16 20:18:59.893575 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:18:59.893548 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85ee8ad9_c0fa_4b55_8366_030eac073f32.slice/crio-88c180e58db0602e183bf4adfb8711a4f43190094dadd9db0326a834a9557c33 WatchSource:0}: Error finding container 88c180e58db0602e183bf4adfb8711a4f43190094dadd9db0326a834a9557c33: Status 404 returned error can't find the container with id 88c180e58db0602e183bf4adfb8711a4f43190094dadd9db0326a834a9557c33 Apr 16 20:19:00.203355 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:00.203327 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sxq4d" event={"ID":"85ee8ad9-c0fa-4b55-8366-030eac073f32","Type":"ContainerStarted","Data":"88c180e58db0602e183bf4adfb8711a4f43190094dadd9db0326a834a9557c33"} Apr 16 20:19:03.212189 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:03.212154 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sxq4d" event={"ID":"85ee8ad9-c0fa-4b55-8366-030eac073f32","Type":"ContainerStarted","Data":"db53cdedbafed2ddede8b4740f9f8963ba838fc8c0d667991fb36920711356bc"} Apr 16 20:19:03.228117 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:03.228076 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sxq4d" podStartSLOduration=1.139485718 podStartE2EDuration="4.22806377s" podCreationTimestamp="2026-04-16 20:18:59 +0000 UTC" firstStartedPulling="2026-04-16 20:18:59.89488738 +0000 UTC m=+440.429243087" 
lastFinishedPulling="2026-04-16 20:19:02.983465419 +0000 UTC m=+443.517821139" observedRunningTime="2026-04-16 20:19:03.226996635 +0000 UTC m=+443.761352352" watchObservedRunningTime="2026-04-16 20:19:03.22806377 +0000 UTC m=+443.762419496" Apr 16 20:19:30.654437 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:30.654376 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-c2jbl"] Apr 16 20:19:30.657526 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:30.657505 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-c2jbl" Apr 16 20:19:30.660265 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:30.660241 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-h52wm\"" Apr 16 20:19:30.660409 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:30.660389 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 16 20:19:30.660578 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:30.660560 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 16 20:19:30.680086 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:30.680064 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-c2jbl"] Apr 16 20:19:30.718410 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:30.718382 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/b8d05626-6f59-419a-9575-7c2764251e7f-operator-config\") pod \"servicemesh-operator3-55f49c5f94-c2jbl\" (UID: \"b8d05626-6f59-419a-9575-7c2764251e7f\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-c2jbl" Apr 16 
20:19:30.718537 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:30.718442 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wzbq\" (UniqueName: \"kubernetes.io/projected/b8d05626-6f59-419a-9575-7c2764251e7f-kube-api-access-7wzbq\") pod \"servicemesh-operator3-55f49c5f94-c2jbl\" (UID: \"b8d05626-6f59-419a-9575-7c2764251e7f\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-c2jbl" Apr 16 20:19:30.819073 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:30.819040 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/b8d05626-6f59-419a-9575-7c2764251e7f-operator-config\") pod \"servicemesh-operator3-55f49c5f94-c2jbl\" (UID: \"b8d05626-6f59-419a-9575-7c2764251e7f\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-c2jbl" Apr 16 20:19:30.819201 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:30.819082 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7wzbq\" (UniqueName: \"kubernetes.io/projected/b8d05626-6f59-419a-9575-7c2764251e7f-kube-api-access-7wzbq\") pod \"servicemesh-operator3-55f49c5f94-c2jbl\" (UID: \"b8d05626-6f59-419a-9575-7c2764251e7f\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-c2jbl" Apr 16 20:19:30.821367 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:30.821346 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/b8d05626-6f59-419a-9575-7c2764251e7f-operator-config\") pod \"servicemesh-operator3-55f49c5f94-c2jbl\" (UID: \"b8d05626-6f59-419a-9575-7c2764251e7f\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-c2jbl" Apr 16 20:19:30.831054 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:30.831033 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wzbq\" (UniqueName: 
\"kubernetes.io/projected/b8d05626-6f59-419a-9575-7c2764251e7f-kube-api-access-7wzbq\") pod \"servicemesh-operator3-55f49c5f94-c2jbl\" (UID: \"b8d05626-6f59-419a-9575-7c2764251e7f\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-c2jbl" Apr 16 20:19:30.966128 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:30.966104 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-c2jbl" Apr 16 20:19:31.084745 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:31.084716 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-c2jbl"] Apr 16 20:19:31.087674 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:19:31.087649 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8d05626_6f59_419a_9575_7c2764251e7f.slice/crio-9f5cf4ddba52f92a4a2984abc8e0ec06f492e75f60fbb76e45f6bdcfe9f5178b WatchSource:0}: Error finding container 9f5cf4ddba52f92a4a2984abc8e0ec06f492e75f60fbb76e45f6bdcfe9f5178b: Status 404 returned error can't find the container with id 9f5cf4ddba52f92a4a2984abc8e0ec06f492e75f60fbb76e45f6bdcfe9f5178b Apr 16 20:19:31.283630 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:31.283543 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-c2jbl" event={"ID":"b8d05626-6f59-419a-9575-7c2764251e7f","Type":"ContainerStarted","Data":"9f5cf4ddba52f92a4a2984abc8e0ec06f492e75f60fbb76e45f6bdcfe9f5178b"} Apr 16 20:19:34.294629 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:34.294592 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-c2jbl" event={"ID":"b8d05626-6f59-419a-9575-7c2764251e7f","Type":"ContainerStarted","Data":"6b5ec033f39bc6b164a2f350b2b0e160c6ba30f53e8d8641d80f88e5f2753d47"} Apr 16 20:19:34.295046 ip-10-0-141-145 
kubenswrapper[2576]: I0416 20:19:34.294751 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-c2jbl" Apr 16 20:19:34.322249 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:34.322203 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-c2jbl" podStartSLOduration=1.812037942 podStartE2EDuration="4.322189438s" podCreationTimestamp="2026-04-16 20:19:30 +0000 UTC" firstStartedPulling="2026-04-16 20:19:31.08990003 +0000 UTC m=+471.624255735" lastFinishedPulling="2026-04-16 20:19:33.600051519 +0000 UTC m=+474.134407231" observedRunningTime="2026-04-16 20:19:34.320355968 +0000 UTC m=+474.854711693" watchObservedRunningTime="2026-04-16 20:19:34.322189438 +0000 UTC m=+474.856545174" Apr 16 20:19:45.300597 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:45.300571 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-c2jbl" Apr 16 20:19:45.890496 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:45.890466 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7"] Apr 16 20:19:45.895551 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:45.895529 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7" Apr 16 20:19:45.897748 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:45.897725 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 16 20:19:45.897826 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:45.897800 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 20:19:45.898389 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:45.898371 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 16 20:19:45.898488 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:45.898412 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 16 20:19:45.898543 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:45.898534 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 16 20:19:45.898756 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:45.898741 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-w8zpm\"" Apr 16 20:19:45.898807 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:45.898741 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 20:19:45.912453 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:45.912421 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7"] Apr 16 20:19:46.025733 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:46.025706 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwz6s\" (UniqueName: 
\"kubernetes.io/projected/c538862c-e47e-4237-abb4-584ddf633e15-kube-api-access-wwz6s\") pod \"istiod-openshift-gateway-7cd77c7ffd-w9zn7\" (UID: \"c538862c-e47e-4237-abb4-584ddf633e15\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7" Apr 16 20:19:46.025849 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:46.025751 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/c538862c-e47e-4237-abb4-584ddf633e15-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-w9zn7\" (UID: \"c538862c-e47e-4237-abb4-584ddf633e15\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7" Apr 16 20:19:46.025849 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:46.025772 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c538862c-e47e-4237-abb4-584ddf633e15-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-w9zn7\" (UID: \"c538862c-e47e-4237-abb4-584ddf633e15\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7" Apr 16 20:19:46.025849 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:46.025798 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/c538862c-e47e-4237-abb4-584ddf633e15-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-w9zn7\" (UID: \"c538862c-e47e-4237-abb4-584ddf633e15\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7" Apr 16 20:19:46.025849 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:46.025823 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c538862c-e47e-4237-abb4-584ddf633e15-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-w9zn7\" 
(UID: \"c538862c-e47e-4237-abb4-584ddf633e15\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7" Apr 16 20:19:46.026062 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:46.025890 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/c538862c-e47e-4237-abb4-584ddf633e15-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-w9zn7\" (UID: \"c538862c-e47e-4237-abb4-584ddf633e15\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7" Apr 16 20:19:46.026062 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:46.025932 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/c538862c-e47e-4237-abb4-584ddf633e15-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-w9zn7\" (UID: \"c538862c-e47e-4237-abb4-584ddf633e15\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7" Apr 16 20:19:46.126721 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:46.126689 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/c538862c-e47e-4237-abb4-584ddf633e15-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-w9zn7\" (UID: \"c538862c-e47e-4237-abb4-584ddf633e15\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7" Apr 16 20:19:46.126897 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:46.126730 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c538862c-e47e-4237-abb4-584ddf633e15-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-w9zn7\" (UID: \"c538862c-e47e-4237-abb4-584ddf633e15\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7" Apr 16 20:19:46.126897 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:46.126853 
2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/c538862c-e47e-4237-abb4-584ddf633e15-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-w9zn7\" (UID: \"c538862c-e47e-4237-abb4-584ddf633e15\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7" Apr 16 20:19:46.127041 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:46.126915 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c538862c-e47e-4237-abb4-584ddf633e15-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-w9zn7\" (UID: \"c538862c-e47e-4237-abb4-584ddf633e15\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7" Apr 16 20:19:46.127041 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:46.126948 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/c538862c-e47e-4237-abb4-584ddf633e15-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-w9zn7\" (UID: \"c538862c-e47e-4237-abb4-584ddf633e15\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7" Apr 16 20:19:46.127041 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:46.126971 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/c538862c-e47e-4237-abb4-584ddf633e15-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-w9zn7\" (UID: \"c538862c-e47e-4237-abb4-584ddf633e15\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7" Apr 16 20:19:46.127186 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:46.127143 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wwz6s\" (UniqueName: \"kubernetes.io/projected/c538862c-e47e-4237-abb4-584ddf633e15-kube-api-access-wwz6s\") pod 
\"istiod-openshift-gateway-7cd77c7ffd-w9zn7\" (UID: \"c538862c-e47e-4237-abb4-584ddf633e15\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7" Apr 16 20:19:46.127402 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:46.127380 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/c538862c-e47e-4237-abb4-584ddf633e15-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-w9zn7\" (UID: \"c538862c-e47e-4237-abb4-584ddf633e15\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7" Apr 16 20:19:46.129171 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:46.129148 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c538862c-e47e-4237-abb4-584ddf633e15-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-w9zn7\" (UID: \"c538862c-e47e-4237-abb4-584ddf633e15\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7" Apr 16 20:19:46.129488 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:46.129455 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/c538862c-e47e-4237-abb4-584ddf633e15-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-w9zn7\" (UID: \"c538862c-e47e-4237-abb4-584ddf633e15\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7" Apr 16 20:19:46.129567 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:46.129552 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/c538862c-e47e-4237-abb4-584ddf633e15-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-w9zn7\" (UID: \"c538862c-e47e-4237-abb4-584ddf633e15\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7" Apr 16 20:19:46.129605 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:46.129572 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/c538862c-e47e-4237-abb4-584ddf633e15-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-w9zn7\" (UID: \"c538862c-e47e-4237-abb4-584ddf633e15\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7" Apr 16 20:19:46.142721 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:46.142675 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwz6s\" (UniqueName: \"kubernetes.io/projected/c538862c-e47e-4237-abb4-584ddf633e15-kube-api-access-wwz6s\") pod \"istiod-openshift-gateway-7cd77c7ffd-w9zn7\" (UID: \"c538862c-e47e-4237-abb4-584ddf633e15\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7" Apr 16 20:19:46.142801 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:46.142786 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c538862c-e47e-4237-abb4-584ddf633e15-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-w9zn7\" (UID: \"c538862c-e47e-4237-abb4-584ddf633e15\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7" Apr 16 20:19:46.205203 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:46.204758 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7" Apr 16 20:19:46.346380 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:46.346346 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7"] Apr 16 20:19:46.348816 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:19:46.348786 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc538862c_e47e_4237_abb4_584ddf633e15.slice/crio-f1fd9b4e2d67afab2ddaf3f89058898cc9cc7554e9edce0a05e831c4c551a688 WatchSource:0}: Error finding container f1fd9b4e2d67afab2ddaf3f89058898cc9cc7554e9edce0a05e831c4c551a688: Status 404 returned error can't find the container with id f1fd9b4e2d67afab2ddaf3f89058898cc9cc7554e9edce0a05e831c4c551a688 Apr 16 20:19:47.335742 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:47.335701 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7" event={"ID":"c538862c-e47e-4237-abb4-584ddf633e15","Type":"ContainerStarted","Data":"f1fd9b4e2d67afab2ddaf3f89058898cc9cc7554e9edce0a05e831c4c551a688"} Apr 16 20:19:49.023241 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:49.023197 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 20:19:49.023617 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:49.023277 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 20:19:49.343076 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:49.342967 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7" 
event={"ID":"c538862c-e47e-4237-abb4-584ddf633e15","Type":"ContainerStarted","Data":"2182db05c8f93066883b17b7ab14f82cd7075d36897649ed5f11a417f460eca9"} Apr 16 20:19:49.343230 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:49.343106 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7" Apr 16 20:19:49.361642 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:49.361594 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7" podStartSLOduration=1.689169749 podStartE2EDuration="4.361582098s" podCreationTimestamp="2026-04-16 20:19:45 +0000 UTC" firstStartedPulling="2026-04-16 20:19:46.350518562 +0000 UTC m=+486.884874266" lastFinishedPulling="2026-04-16 20:19:49.022930911 +0000 UTC m=+489.557286615" observedRunningTime="2026-04-16 20:19:49.360330069 +0000 UTC m=+489.894685794" watchObservedRunningTime="2026-04-16 20:19:49.361582098 +0000 UTC m=+489.895937824" Apr 16 20:19:50.347368 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:50.347338 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7" Apr 16 20:19:51.911285 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:51.911250 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf"] Apr 16 20:19:51.914405 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:51.914388 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf" Apr 16 20:19:51.916621 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:51.916598 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"openshift-ai-inference-openshift-default-dockercfg-4lq4l\"" Apr 16 20:19:51.924753 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:51.924733 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf"] Apr 16 20:19:52.076824 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:52.076788 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/d252744b-82fe-4054-8a24-f06d1b966faa-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-n9qgf\" (UID: \"d252744b-82fe-4054-8a24-f06d1b966faa\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf" Apr 16 20:19:52.076999 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:52.076831 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/d252744b-82fe-4054-8a24-f06d1b966faa-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-n9qgf\" (UID: \"d252744b-82fe-4054-8a24-f06d1b966faa\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf" Apr 16 20:19:52.076999 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:52.076856 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbzb7\" (UniqueName: \"kubernetes.io/projected/d252744b-82fe-4054-8a24-f06d1b966faa-kube-api-access-nbzb7\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-n9qgf\" (UID: \"d252744b-82fe-4054-8a24-f06d1b966faa\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf" Apr 16 20:19:52.076999 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:52.076923 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/d252744b-82fe-4054-8a24-f06d1b966faa-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-n9qgf\" (UID: \"d252744b-82fe-4054-8a24-f06d1b966faa\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf" Apr 16 20:19:52.076999 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:52.076961 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/d252744b-82fe-4054-8a24-f06d1b966faa-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-n9qgf\" (UID: \"d252744b-82fe-4054-8a24-f06d1b966faa\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf" Apr 16 20:19:52.077161 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:52.077022 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/d252744b-82fe-4054-8a24-f06d1b966faa-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-n9qgf\" (UID: \"d252744b-82fe-4054-8a24-f06d1b966faa\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf" Apr 16 20:19:52.077161 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:52.077064 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/d252744b-82fe-4054-8a24-f06d1b966faa-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-n9qgf\" (UID: \"d252744b-82fe-4054-8a24-f06d1b966faa\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf" Apr 16 20:19:52.077161 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:52.077084 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/d252744b-82fe-4054-8a24-f06d1b966faa-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-n9qgf\" (UID: \"d252744b-82fe-4054-8a24-f06d1b966faa\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf" Apr 16 20:19:52.077161 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:52.077114 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/d252744b-82fe-4054-8a24-f06d1b966faa-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-n9qgf\" (UID: \"d252744b-82fe-4054-8a24-f06d1b966faa\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf" Apr 16 20:19:52.178085 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:52.177972 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/d252744b-82fe-4054-8a24-f06d1b966faa-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-n9qgf\" (UID: \"d252744b-82fe-4054-8a24-f06d1b966faa\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf" Apr 16 20:19:52.178085 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:52.178055 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/d252744b-82fe-4054-8a24-f06d1b966faa-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-n9qgf\" (UID: \"d252744b-82fe-4054-8a24-f06d1b966faa\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf" Apr 16 20:19:52.178352 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:52.178088 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/d252744b-82fe-4054-8a24-f06d1b966faa-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-n9qgf\" (UID: \"d252744b-82fe-4054-8a24-f06d1b966faa\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf" Apr 16 20:19:52.178352 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:52.178114 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nbzb7\" (UniqueName: \"kubernetes.io/projected/d252744b-82fe-4054-8a24-f06d1b966faa-kube-api-access-nbzb7\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-n9qgf\" (UID: \"d252744b-82fe-4054-8a24-f06d1b966faa\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf" Apr 16 20:19:52.178352 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:52.178147 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/d252744b-82fe-4054-8a24-f06d1b966faa-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-n9qgf\" (UID: \"d252744b-82fe-4054-8a24-f06d1b966faa\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf" Apr 16 20:19:52.178352 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:52.178176 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/d252744b-82fe-4054-8a24-f06d1b966faa-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-n9qgf\" (UID: \"d252744b-82fe-4054-8a24-f06d1b966faa\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf" Apr 16 
20:19:52.178352 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:52.178205 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/d252744b-82fe-4054-8a24-f06d1b966faa-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-n9qgf\" (UID: \"d252744b-82fe-4054-8a24-f06d1b966faa\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf" Apr 16 20:19:52.178352 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:52.178247 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/d252744b-82fe-4054-8a24-f06d1b966faa-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-n9qgf\" (UID: \"d252744b-82fe-4054-8a24-f06d1b966faa\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf" Apr 16 20:19:52.178352 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:52.178273 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/d252744b-82fe-4054-8a24-f06d1b966faa-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-n9qgf\" (UID: \"d252744b-82fe-4054-8a24-f06d1b966faa\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf" Apr 16 20:19:52.179077 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:52.179048 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/d252744b-82fe-4054-8a24-f06d1b966faa-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-n9qgf\" (UID: \"d252744b-82fe-4054-8a24-f06d1b966faa\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf" Apr 16 20:19:52.179424 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:52.179399 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/d252744b-82fe-4054-8a24-f06d1b966faa-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-n9qgf\" (UID: \"d252744b-82fe-4054-8a24-f06d1b966faa\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf" Apr 16 20:19:52.179510 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:52.179421 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/d252744b-82fe-4054-8a24-f06d1b966faa-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-n9qgf\" (UID: \"d252744b-82fe-4054-8a24-f06d1b966faa\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf" Apr 16 20:19:52.179510 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:52.179430 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/d252744b-82fe-4054-8a24-f06d1b966faa-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-n9qgf\" (UID: \"d252744b-82fe-4054-8a24-f06d1b966faa\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf" Apr 16 20:19:52.179855 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:52.179829 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/d252744b-82fe-4054-8a24-f06d1b966faa-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-n9qgf\" (UID: \"d252744b-82fe-4054-8a24-f06d1b966faa\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf" Apr 16 20:19:52.180727 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:52.180679 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: 
\"kubernetes.io/empty-dir/d252744b-82fe-4054-8a24-f06d1b966faa-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-n9qgf\" (UID: \"d252744b-82fe-4054-8a24-f06d1b966faa\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf" Apr 16 20:19:52.180907 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:52.180885 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/d252744b-82fe-4054-8a24-f06d1b966faa-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-n9qgf\" (UID: \"d252744b-82fe-4054-8a24-f06d1b966faa\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf" Apr 16 20:19:52.185906 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:52.185879 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/d252744b-82fe-4054-8a24-f06d1b966faa-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-n9qgf\" (UID: \"d252744b-82fe-4054-8a24-f06d1b966faa\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf" Apr 16 20:19:52.186152 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:52.186136 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbzb7\" (UniqueName: \"kubernetes.io/projected/d252744b-82fe-4054-8a24-f06d1b966faa-kube-api-access-nbzb7\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-n9qgf\" (UID: \"d252744b-82fe-4054-8a24-f06d1b966faa\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf" Apr 16 20:19:52.225485 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:52.225460 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf" Apr 16 20:19:52.353586 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:52.353558 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf"] Apr 16 20:19:52.356234 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:19:52.356203 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd252744b_82fe_4054_8a24_f06d1b966faa.slice/crio-b13503023b1798f7fb4c6a2b6b9f72ed717cef6064e2106973126f44bb3950ed WatchSource:0}: Error finding container b13503023b1798f7fb4c6a2b6b9f72ed717cef6064e2106973126f44bb3950ed: Status 404 returned error can't find the container with id b13503023b1798f7fb4c6a2b6b9f72ed717cef6064e2106973126f44bb3950ed Apr 16 20:19:53.356414 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:53.356373 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf" event={"ID":"d252744b-82fe-4054-8a24-f06d1b966faa","Type":"ContainerStarted","Data":"b13503023b1798f7fb4c6a2b6b9f72ed717cef6064e2106973126f44bb3950ed"} Apr 16 20:19:54.491325 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:54.491289 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 20:19:54.491578 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:54.491368 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 20:19:54.491578 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:54.491397 2576 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 20:19:55.363791 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:55.363757 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf" event={"ID":"d252744b-82fe-4054-8a24-f06d1b966faa","Type":"ContainerStarted","Data":"a2eee0d0159be047d01aa36b42498d04e5667fa84e35b813a00e66b4e6cc7651"} Apr 16 20:19:55.385042 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:55.384996 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf" podStartSLOduration=2.251974869 podStartE2EDuration="4.384964841s" podCreationTimestamp="2026-04-16 20:19:51 +0000 UTC" firstStartedPulling="2026-04-16 20:19:52.358071471 +0000 UTC m=+492.892427175" lastFinishedPulling="2026-04-16 20:19:54.491061426 +0000 UTC m=+495.025417147" observedRunningTime="2026-04-16 20:19:55.382780304 +0000 UTC m=+495.917136052" watchObservedRunningTime="2026-04-16 20:19:55.384964841 +0000 UTC m=+495.919320566" Apr 16 20:19:56.226577 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:56.226552 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf" Apr 16 20:19:56.231129 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:56.231105 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf" Apr 16 20:19:56.366245 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:56.366223 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf" Apr 16 20:19:56.367056 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:19:56.367038 2576 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-n9qgf" Apr 16 20:20:11.201771 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:20:11.201739 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-f7wmw"] Apr 16 20:20:11.213229 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:20:11.213196 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-f7wmw" Apr 16 20:20:11.213858 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:20:11.213834 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-f7wmw"] Apr 16 20:20:11.215443 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:20:11.215423 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 20:20:11.215561 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:20:11.215513 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-tpbzc\"" Apr 16 20:20:11.216246 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:20:11.216231 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 20:20:11.317364 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:20:11.317331 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g5kj\" (UniqueName: \"kubernetes.io/projected/6fe35cd4-e497-4c2f-99f8-55304c5d108f-kube-api-access-7g5kj\") pod \"authorino-operator-7587b89b76-f7wmw\" (UID: \"6fe35cd4-e497-4c2f-99f8-55304c5d108f\") " pod="kuadrant-system/authorino-operator-7587b89b76-f7wmw" Apr 16 20:20:11.417968 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:20:11.417938 2576 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-7g5kj\" (UniqueName: \"kubernetes.io/projected/6fe35cd4-e497-4c2f-99f8-55304c5d108f-kube-api-access-7g5kj\") pod \"authorino-operator-7587b89b76-f7wmw\" (UID: \"6fe35cd4-e497-4c2f-99f8-55304c5d108f\") " pod="kuadrant-system/authorino-operator-7587b89b76-f7wmw" Apr 16 20:20:11.425809 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:20:11.425778 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g5kj\" (UniqueName: \"kubernetes.io/projected/6fe35cd4-e497-4c2f-99f8-55304c5d108f-kube-api-access-7g5kj\") pod \"authorino-operator-7587b89b76-f7wmw\" (UID: \"6fe35cd4-e497-4c2f-99f8-55304c5d108f\") " pod="kuadrant-system/authorino-operator-7587b89b76-f7wmw" Apr 16 20:20:11.524096 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:20:11.524035 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-f7wmw" Apr 16 20:20:11.652938 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:20:11.652856 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-f7wmw"] Apr 16 20:20:11.655124 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:20:11.655098 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fe35cd4_e497_4c2f_99f8_55304c5d108f.slice/crio-fcd67d51bd05ab1037ed867d75e2e8216e61b70882b512822650f283da777638 WatchSource:0}: Error finding container fcd67d51bd05ab1037ed867d75e2e8216e61b70882b512822650f283da777638: Status 404 returned error can't find the container with id fcd67d51bd05ab1037ed867d75e2e8216e61b70882b512822650f283da777638 Apr 16 20:20:12.411917 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:20:12.411878 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-f7wmw" 
event={"ID":"6fe35cd4-e497-4c2f-99f8-55304c5d108f","Type":"ContainerStarted","Data":"fcd67d51bd05ab1037ed867d75e2e8216e61b70882b512822650f283da777638"} Apr 16 20:20:13.351424 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:20:13.351390 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rkbk9"] Apr 16 20:20:13.359803 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:20:13.359774 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rkbk9" Apr 16 20:20:13.362744 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:20:13.362506 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-54sq8\"" Apr 16 20:20:13.365210 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:20:13.365183 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rkbk9"] Apr 16 20:20:13.433863 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:20:13.433827 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tgz5\" (UniqueName: \"kubernetes.io/projected/cf1453ee-5268-4494-8172-413ee64649c8-kube-api-access-6tgz5\") pod \"limitador-operator-controller-manager-c7fb4c8d5-rkbk9\" (UID: \"cf1453ee-5268-4494-8172-413ee64649c8\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rkbk9" Apr 16 20:20:13.535222 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:20:13.535184 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tgz5\" (UniqueName: \"kubernetes.io/projected/cf1453ee-5268-4494-8172-413ee64649c8-kube-api-access-6tgz5\") pod \"limitador-operator-controller-manager-c7fb4c8d5-rkbk9\" (UID: \"cf1453ee-5268-4494-8172-413ee64649c8\") " 
pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rkbk9" Apr 16 20:20:13.545029 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:20:13.544967 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tgz5\" (UniqueName: \"kubernetes.io/projected/cf1453ee-5268-4494-8172-413ee64649c8-kube-api-access-6tgz5\") pod \"limitador-operator-controller-manager-c7fb4c8d5-rkbk9\" (UID: \"cf1453ee-5268-4494-8172-413ee64649c8\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rkbk9" Apr 16 20:20:13.673351 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:20:13.673277 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rkbk9" Apr 16 20:20:14.168083 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:20:14.168061 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rkbk9"] Apr 16 20:20:14.170821 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:20:14.170795 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf1453ee_5268_4494_8172_413ee64649c8.slice/crio-7f867ff5bff3f2fdd5532b6a9ac65fd7ad58d51b3b7ce2bbe3f03f5557c196c6 WatchSource:0}: Error finding container 7f867ff5bff3f2fdd5532b6a9ac65fd7ad58d51b3b7ce2bbe3f03f5557c196c6: Status 404 returned error can't find the container with id 7f867ff5bff3f2fdd5532b6a9ac65fd7ad58d51b3b7ce2bbe3f03f5557c196c6 Apr 16 20:20:14.421611 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:20:14.421516 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-f7wmw" event={"ID":"6fe35cd4-e497-4c2f-99f8-55304c5d108f","Type":"ContainerStarted","Data":"ae370619b781e56e64410d39cff79308257f46fa718a46c06b159e4ab7888b11"} Apr 16 20:20:14.421775 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:20:14.421621 2576 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-7587b89b76-f7wmw" Apr 16 20:20:14.422602 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:20:14.422578 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rkbk9" event={"ID":"cf1453ee-5268-4494-8172-413ee64649c8","Type":"ContainerStarted","Data":"7f867ff5bff3f2fdd5532b6a9ac65fd7ad58d51b3b7ce2bbe3f03f5557c196c6"} Apr 16 20:20:14.447081 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:20:14.447025 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-7587b89b76-f7wmw" podStartSLOduration=0.993611627 podStartE2EDuration="3.447006311s" podCreationTimestamp="2026-04-16 20:20:11 +0000 UTC" firstStartedPulling="2026-04-16 20:20:11.657230003 +0000 UTC m=+512.191585708" lastFinishedPulling="2026-04-16 20:20:14.110624675 +0000 UTC m=+514.644980392" observedRunningTime="2026-04-16 20:20:14.443926955 +0000 UTC m=+514.978282681" watchObservedRunningTime="2026-04-16 20:20:14.447006311 +0000 UTC m=+514.981362037" Apr 16 20:20:17.434029 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:20:17.433970 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rkbk9" event={"ID":"cf1453ee-5268-4494-8172-413ee64649c8","Type":"ContainerStarted","Data":"13064e0cc6cc9604ab7f235bf78259034b1e1c1c33bda4cc56be821d1be49f26"} Apr 16 20:20:17.434472 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:20:17.434171 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rkbk9" Apr 16 20:20:17.465745 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:20:17.465442 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rkbk9" 
podStartSLOduration=2.142419242 podStartE2EDuration="4.465425267s" podCreationTimestamp="2026-04-16 20:20:13 +0000 UTC" firstStartedPulling="2026-04-16 20:20:14.172811334 +0000 UTC m=+514.707167039" lastFinishedPulling="2026-04-16 20:20:16.495817357 +0000 UTC m=+517.030173064" observedRunningTime="2026-04-16 20:20:17.464142904 +0000 UTC m=+517.998498655" watchObservedRunningTime="2026-04-16 20:20:17.465425267 +0000 UTC m=+517.999780996" Apr 16 20:20:25.430063 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:20:25.429974 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-7587b89b76-f7wmw" Apr 16 20:20:28.440224 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:20:28.440200 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rkbk9" Apr 16 20:21:40.225523 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:40.225492 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-qz9fk"] Apr 16 20:21:40.235236 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:40.235212 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qz9fk" Apr 16 20:21:40.258586 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:40.258564 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-qz9fk"] Apr 16 20:21:40.339896 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:40.339870 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/d7ec2a16-7abe-4726-8c1e-55762256be38-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-qz9fk\" (UID: \"d7ec2a16-7abe-4726-8c1e-55762256be38\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qz9fk" Apr 16 20:21:40.340038 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:40.339904 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/d7ec2a16-7abe-4726-8c1e-55762256be38-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-qz9fk\" (UID: \"d7ec2a16-7abe-4726-8c1e-55762256be38\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qz9fk" Apr 16 20:21:40.340038 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:40.339932 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/d7ec2a16-7abe-4726-8c1e-55762256be38-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-qz9fk\" (UID: \"d7ec2a16-7abe-4726-8c1e-55762256be38\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qz9fk" Apr 16 20:21:40.340117 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:40.340059 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/d7ec2a16-7abe-4726-8c1e-55762256be38-istio-csr-dns-cert\") pod 
\"istiod-openshift-gateway-55ff986f96-qz9fk\" (UID: \"d7ec2a16-7abe-4726-8c1e-55762256be38\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qz9fk" Apr 16 20:21:40.340117 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:40.340101 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/d7ec2a16-7abe-4726-8c1e-55762256be38-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-qz9fk\" (UID: \"d7ec2a16-7abe-4726-8c1e-55762256be38\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qz9fk" Apr 16 20:21:40.340192 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:40.340163 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d7ec2a16-7abe-4726-8c1e-55762256be38-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-qz9fk\" (UID: \"d7ec2a16-7abe-4726-8c1e-55762256be38\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qz9fk" Apr 16 20:21:40.340228 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:40.340191 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64k4q\" (UniqueName: \"kubernetes.io/projected/d7ec2a16-7abe-4726-8c1e-55762256be38-kube-api-access-64k4q\") pod \"istiod-openshift-gateway-55ff986f96-qz9fk\" (UID: \"d7ec2a16-7abe-4726-8c1e-55762256be38\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qz9fk" Apr 16 20:21:40.441493 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:40.441466 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d7ec2a16-7abe-4726-8c1e-55762256be38-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-qz9fk\" (UID: \"d7ec2a16-7abe-4726-8c1e-55762256be38\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qz9fk" Apr 16 
20:21:40.441616 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:40.441497 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-64k4q\" (UniqueName: \"kubernetes.io/projected/d7ec2a16-7abe-4726-8c1e-55762256be38-kube-api-access-64k4q\") pod \"istiod-openshift-gateway-55ff986f96-qz9fk\" (UID: \"d7ec2a16-7abe-4726-8c1e-55762256be38\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qz9fk" Apr 16 20:21:40.441616 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:40.441525 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/d7ec2a16-7abe-4726-8c1e-55762256be38-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-qz9fk\" (UID: \"d7ec2a16-7abe-4726-8c1e-55762256be38\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qz9fk" Apr 16 20:21:40.441616 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:40.441551 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/d7ec2a16-7abe-4726-8c1e-55762256be38-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-qz9fk\" (UID: \"d7ec2a16-7abe-4726-8c1e-55762256be38\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qz9fk" Apr 16 20:21:40.441616 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:40.441590 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/d7ec2a16-7abe-4726-8c1e-55762256be38-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-qz9fk\" (UID: \"d7ec2a16-7abe-4726-8c1e-55762256be38\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qz9fk" Apr 16 20:21:40.441821 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:40.441623 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: 
\"kubernetes.io/secret/d7ec2a16-7abe-4726-8c1e-55762256be38-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-qz9fk\" (UID: \"d7ec2a16-7abe-4726-8c1e-55762256be38\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qz9fk" Apr 16 20:21:40.441821 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:40.441655 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/d7ec2a16-7abe-4726-8c1e-55762256be38-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-qz9fk\" (UID: \"d7ec2a16-7abe-4726-8c1e-55762256be38\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qz9fk" Apr 16 20:21:40.442564 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:40.442527 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/d7ec2a16-7abe-4726-8c1e-55762256be38-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-qz9fk\" (UID: \"d7ec2a16-7abe-4726-8c1e-55762256be38\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qz9fk" Apr 16 20:21:40.444481 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:40.444458 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d7ec2a16-7abe-4726-8c1e-55762256be38-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-qz9fk\" (UID: \"d7ec2a16-7abe-4726-8c1e-55762256be38\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qz9fk" Apr 16 20:21:40.444723 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:40.444704 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/d7ec2a16-7abe-4726-8c1e-55762256be38-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-qz9fk\" (UID: \"d7ec2a16-7abe-4726-8c1e-55762256be38\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qz9fk" Apr 16 
20:21:40.444803 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:40.444766 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/d7ec2a16-7abe-4726-8c1e-55762256be38-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-qz9fk\" (UID: \"d7ec2a16-7abe-4726-8c1e-55762256be38\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qz9fk" Apr 16 20:21:40.444882 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:40.444860 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/d7ec2a16-7abe-4726-8c1e-55762256be38-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-qz9fk\" (UID: \"d7ec2a16-7abe-4726-8c1e-55762256be38\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qz9fk" Apr 16 20:21:40.451812 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:40.450793 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/d7ec2a16-7abe-4726-8c1e-55762256be38-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-qz9fk\" (UID: \"d7ec2a16-7abe-4726-8c1e-55762256be38\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qz9fk" Apr 16 20:21:40.454289 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:40.452584 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-64k4q\" (UniqueName: \"kubernetes.io/projected/d7ec2a16-7abe-4726-8c1e-55762256be38-kube-api-access-64k4q\") pod \"istiod-openshift-gateway-55ff986f96-qz9fk\" (UID: \"d7ec2a16-7abe-4726-8c1e-55762256be38\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qz9fk" Apr 16 20:21:40.544953 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:40.544893 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qz9fk" Apr 16 20:21:40.685507 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:40.685482 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-qz9fk"] Apr 16 20:21:40.687753 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:21:40.687725 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7ec2a16_7abe_4726_8c1e_55762256be38.slice/crio-af8e636040c923564044a88de8273f88af5bed5f1832a07ae7c966cb0fec5467 WatchSource:0}: Error finding container af8e636040c923564044a88de8273f88af5bed5f1832a07ae7c966cb0fec5467: Status 404 returned error can't find the container with id af8e636040c923564044a88de8273f88af5bed5f1832a07ae7c966cb0fec5467 Apr 16 20:21:40.689757 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:40.689720 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 20:21:40.689831 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:40.689795 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 20:21:41.692341 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:41.692300 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qz9fk" event={"ID":"d7ec2a16-7abe-4726-8c1e-55762256be38","Type":"ContainerStarted","Data":"319a217f9e8f30f3b53657635a2df488c01d3fec2425dc64537dfad1cdc713fb"} Apr 16 20:21:41.692341 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:41.692343 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qz9fk" 
event={"ID":"d7ec2a16-7abe-4726-8c1e-55762256be38","Type":"ContainerStarted","Data":"af8e636040c923564044a88de8273f88af5bed5f1832a07ae7c966cb0fec5467"} Apr 16 20:21:41.692852 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:41.692557 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qz9fk" Apr 16 20:21:41.694389 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:41.694362 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qz9fk" Apr 16 20:21:41.737880 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:41.737838 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qz9fk" podStartSLOduration=1.737824375 podStartE2EDuration="1.737824375s" podCreationTimestamp="2026-04-16 20:21:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:21:41.736083498 +0000 UTC m=+602.270439224" watchObservedRunningTime="2026-04-16 20:21:41.737824375 +0000 UTC m=+602.272180099" Apr 16 20:21:41.790488 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:41.790462 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7"] Apr 16 20:21:41.790769 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:41.790743 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7" podUID="c538862c-e47e-4237-abb4-584ddf633e15" containerName="discovery" containerID="cri-o://2182db05c8f93066883b17b7ab14f82cd7075d36897649ed5f11a417f460eca9" gracePeriod=30 Apr 16 20:21:42.032463 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:42.032443 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7" Apr 16 20:21:42.155738 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:42.155706 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/c538862c-e47e-4237-abb4-584ddf633e15-cacerts\") pod \"c538862c-e47e-4237-abb4-584ddf633e15\" (UID: \"c538862c-e47e-4237-abb4-584ddf633e15\") " Apr 16 20:21:42.155738 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:42.155742 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c538862c-e47e-4237-abb4-584ddf633e15-istio-kubeconfig\") pod \"c538862c-e47e-4237-abb4-584ddf633e15\" (UID: \"c538862c-e47e-4237-abb4-584ddf633e15\") " Apr 16 20:21:42.155946 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:42.155794 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwz6s\" (UniqueName: \"kubernetes.io/projected/c538862c-e47e-4237-abb4-584ddf633e15-kube-api-access-wwz6s\") pod \"c538862c-e47e-4237-abb4-584ddf633e15\" (UID: \"c538862c-e47e-4237-abb4-584ddf633e15\") " Apr 16 20:21:42.155946 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:42.155909 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c538862c-e47e-4237-abb4-584ddf633e15-istio-token\") pod \"c538862c-e47e-4237-abb4-584ddf633e15\" (UID: \"c538862c-e47e-4237-abb4-584ddf633e15\") " Apr 16 20:21:42.156083 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:42.155953 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/c538862c-e47e-4237-abb4-584ddf633e15-istio-csr-dns-cert\") pod \"c538862c-e47e-4237-abb4-584ddf633e15\" (UID: \"c538862c-e47e-4237-abb4-584ddf633e15\") " Apr 16 20:21:42.156144 ip-10-0-141-145 
kubenswrapper[2576]: I0416 20:21:42.156089 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/c538862c-e47e-4237-abb4-584ddf633e15-istio-csr-ca-configmap\") pod \"c538862c-e47e-4237-abb4-584ddf633e15\" (UID: \"c538862c-e47e-4237-abb4-584ddf633e15\") " Apr 16 20:21:42.156144 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:42.156137 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/c538862c-e47e-4237-abb4-584ddf633e15-local-certs\") pod \"c538862c-e47e-4237-abb4-584ddf633e15\" (UID: \"c538862c-e47e-4237-abb4-584ddf633e15\") " Apr 16 20:21:42.156434 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:42.156402 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c538862c-e47e-4237-abb4-584ddf633e15-istio-csr-ca-configmap" (OuterVolumeSpecName: "istio-csr-ca-configmap") pod "c538862c-e47e-4237-abb4-584ddf633e15" (UID: "c538862c-e47e-4237-abb4-584ddf633e15"). InnerVolumeSpecName "istio-csr-ca-configmap". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:21:42.158194 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:42.158166 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c538862c-e47e-4237-abb4-584ddf633e15-istio-token" (OuterVolumeSpecName: "istio-token") pod "c538862c-e47e-4237-abb4-584ddf633e15" (UID: "c538862c-e47e-4237-abb4-584ddf633e15"). InnerVolumeSpecName "istio-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:21:42.158291 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:42.158258 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c538862c-e47e-4237-abb4-584ddf633e15-cacerts" (OuterVolumeSpecName: "cacerts") pod "c538862c-e47e-4237-abb4-584ddf633e15" (UID: "c538862c-e47e-4237-abb4-584ddf633e15"). InnerVolumeSpecName "cacerts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:21:42.158291 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:42.158268 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c538862c-e47e-4237-abb4-584ddf633e15-istio-csr-dns-cert" (OuterVolumeSpecName: "istio-csr-dns-cert") pod "c538862c-e47e-4237-abb4-584ddf633e15" (UID: "c538862c-e47e-4237-abb4-584ddf633e15"). InnerVolumeSpecName "istio-csr-dns-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:21:42.158408 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:42.158382 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c538862c-e47e-4237-abb4-584ddf633e15-istio-kubeconfig" (OuterVolumeSpecName: "istio-kubeconfig") pod "c538862c-e47e-4237-abb4-584ddf633e15" (UID: "c538862c-e47e-4237-abb4-584ddf633e15"). InnerVolumeSpecName "istio-kubeconfig". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:21:42.158539 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:42.158520 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c538862c-e47e-4237-abb4-584ddf633e15-local-certs" (OuterVolumeSpecName: "local-certs") pod "c538862c-e47e-4237-abb4-584ddf633e15" (UID: "c538862c-e47e-4237-abb4-584ddf633e15"). InnerVolumeSpecName "local-certs". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:21:42.158711 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:42.158697 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c538862c-e47e-4237-abb4-584ddf633e15-kube-api-access-wwz6s" (OuterVolumeSpecName: "kube-api-access-wwz6s") pod "c538862c-e47e-4237-abb4-584ddf633e15" (UID: "c538862c-e47e-4237-abb4-584ddf633e15"). InnerVolumeSpecName "kube-api-access-wwz6s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:21:42.257512 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:42.257444 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wwz6s\" (UniqueName: \"kubernetes.io/projected/c538862c-e47e-4237-abb4-584ddf633e15-kube-api-access-wwz6s\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:21:42.257512 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:42.257469 2576 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c538862c-e47e-4237-abb4-584ddf633e15-istio-token\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:21:42.257512 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:42.257482 2576 reconciler_common.go:299] "Volume detached for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/c538862c-e47e-4237-abb4-584ddf633e15-istio-csr-dns-cert\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:21:42.257512 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:42.257490 2576 reconciler_common.go:299] "Volume detached for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/c538862c-e47e-4237-abb4-584ddf633e15-istio-csr-ca-configmap\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:21:42.257512 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:42.257499 2576 reconciler_common.go:299] "Volume detached for volume \"local-certs\" (UniqueName: 
\"kubernetes.io/empty-dir/c538862c-e47e-4237-abb4-584ddf633e15-local-certs\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:21:42.257512 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:42.257508 2576 reconciler_common.go:299] "Volume detached for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/c538862c-e47e-4237-abb4-584ddf633e15-cacerts\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:21:42.257512 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:42.257516 2576 reconciler_common.go:299] "Volume detached for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c538862c-e47e-4237-abb4-584ddf633e15-istio-kubeconfig\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:21:42.696903 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:42.696823 2576 generic.go:358] "Generic (PLEG): container finished" podID="c538862c-e47e-4237-abb4-584ddf633e15" containerID="2182db05c8f93066883b17b7ab14f82cd7075d36897649ed5f11a417f460eca9" exitCode=0 Apr 16 20:21:42.696903 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:42.696886 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7" Apr 16 20:21:42.697361 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:42.696911 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7" event={"ID":"c538862c-e47e-4237-abb4-584ddf633e15","Type":"ContainerDied","Data":"2182db05c8f93066883b17b7ab14f82cd7075d36897649ed5f11a417f460eca9"} Apr 16 20:21:42.697361 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:42.696958 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7" event={"ID":"c538862c-e47e-4237-abb4-584ddf633e15","Type":"ContainerDied","Data":"f1fd9b4e2d67afab2ddaf3f89058898cc9cc7554e9edce0a05e831c4c551a688"} Apr 16 20:21:42.697361 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:42.696993 2576 scope.go:117] "RemoveContainer" containerID="2182db05c8f93066883b17b7ab14f82cd7075d36897649ed5f11a417f460eca9" Apr 16 20:21:42.705224 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:42.705167 2576 scope.go:117] "RemoveContainer" containerID="2182db05c8f93066883b17b7ab14f82cd7075d36897649ed5f11a417f460eca9" Apr 16 20:21:42.707841 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:21:42.707808 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2182db05c8f93066883b17b7ab14f82cd7075d36897649ed5f11a417f460eca9\": container with ID starting with 2182db05c8f93066883b17b7ab14f82cd7075d36897649ed5f11a417f460eca9 not found: ID does not exist" containerID="2182db05c8f93066883b17b7ab14f82cd7075d36897649ed5f11a417f460eca9" Apr 16 20:21:42.707962 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:42.707858 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2182db05c8f93066883b17b7ab14f82cd7075d36897649ed5f11a417f460eca9"} err="failed to get container status 
\"2182db05c8f93066883b17b7ab14f82cd7075d36897649ed5f11a417f460eca9\": rpc error: code = NotFound desc = could not find container \"2182db05c8f93066883b17b7ab14f82cd7075d36897649ed5f11a417f460eca9\": container with ID starting with 2182db05c8f93066883b17b7ab14f82cd7075d36897649ed5f11a417f460eca9 not found: ID does not exist" Apr 16 20:21:42.737093 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:42.737065 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7"] Apr 16 20:21:42.739175 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:42.739155 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w9zn7"] Apr 16 20:21:44.059146 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:44.059106 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c538862c-e47e-4237-abb4-584ddf633e15" path="/var/lib/kubelet/pods/c538862c-e47e-4237-abb4-584ddf633e15/volumes" Apr 16 20:21:49.637292 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:49.637260 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-b8wr9"] Apr 16 20:21:49.637718 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:49.637545 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c538862c-e47e-4237-abb4-584ddf633e15" containerName="discovery" Apr 16 20:21:49.637718 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:49.637556 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c538862c-e47e-4237-abb4-584ddf633e15" containerName="discovery" Apr 16 20:21:49.637718 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:49.637614 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c538862c-e47e-4237-abb4-584ddf633e15" containerName="discovery" Apr 16 20:21:49.641860 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:49.641843 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-b8wr9" Apr 16 20:21:49.644112 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:49.644090 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 16 20:21:49.644211 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:49.644143 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 20:21:49.644830 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:49.644804 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-l4mnt\"" Apr 16 20:21:49.644910 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:49.644842 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 20:21:49.647016 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:49.646972 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-b8wr9"] Apr 16 20:21:49.809970 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:49.809942 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5-cert\") pod \"kserve-controller-manager-66cf78b85b-b8wr9\" (UID: \"4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5\") " pod="kserve/kserve-controller-manager-66cf78b85b-b8wr9" Apr 16 20:21:49.810132 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:49.809975 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz9f7\" (UniqueName: \"kubernetes.io/projected/4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5-kube-api-access-pz9f7\") pod \"kserve-controller-manager-66cf78b85b-b8wr9\" (UID: \"4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5\") " pod="kserve/kserve-controller-manager-66cf78b85b-b8wr9" Apr 16 
20:21:49.910598 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:49.910538 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pz9f7\" (UniqueName: \"kubernetes.io/projected/4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5-kube-api-access-pz9f7\") pod \"kserve-controller-manager-66cf78b85b-b8wr9\" (UID: \"4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5\") " pod="kserve/kserve-controller-manager-66cf78b85b-b8wr9" Apr 16 20:21:49.910705 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:49.910605 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5-cert\") pod \"kserve-controller-manager-66cf78b85b-b8wr9\" (UID: \"4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5\") " pod="kserve/kserve-controller-manager-66cf78b85b-b8wr9" Apr 16 20:21:49.912795 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:49.912769 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5-cert\") pod \"kserve-controller-manager-66cf78b85b-b8wr9\" (UID: \"4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5\") " pod="kserve/kserve-controller-manager-66cf78b85b-b8wr9" Apr 16 20:21:49.920912 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:49.920887 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz9f7\" (UniqueName: \"kubernetes.io/projected/4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5-kube-api-access-pz9f7\") pod \"kserve-controller-manager-66cf78b85b-b8wr9\" (UID: \"4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5\") " pod="kserve/kserve-controller-manager-66cf78b85b-b8wr9" Apr 16 20:21:49.952604 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:49.952581 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-b8wr9" Apr 16 20:21:50.072673 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:50.072514 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-b8wr9"] Apr 16 20:21:50.075518 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:21:50.075490 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e69f8d8_a444_4bd7_af15_0dd63c7fc8c5.slice/crio-034a3f495478bf3c973e3ca4228e53a06465e154ba6ed16f988079196aa53ba3 WatchSource:0}: Error finding container 034a3f495478bf3c973e3ca4228e53a06465e154ba6ed16f988079196aa53ba3: Status 404 returned error can't find the container with id 034a3f495478bf3c973e3ca4228e53a06465e154ba6ed16f988079196aa53ba3 Apr 16 20:21:50.723314 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:50.723211 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-b8wr9" event={"ID":"4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5","Type":"ContainerStarted","Data":"034a3f495478bf3c973e3ca4228e53a06465e154ba6ed16f988079196aa53ba3"} Apr 16 20:21:52.732517 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:52.732424 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-b8wr9" event={"ID":"4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5","Type":"ContainerStarted","Data":"ee5adefbd4aaa2173401ee55c20449b7e726f81678291f226f0d0731e5dbc2e3"} Apr 16 20:21:52.732938 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:52.732591 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-66cf78b85b-b8wr9" Apr 16 20:21:52.750622 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:21:52.750575 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-66cf78b85b-b8wr9" podStartSLOduration=1.345188859 
podStartE2EDuration="3.75056318s" podCreationTimestamp="2026-04-16 20:21:49 +0000 UTC" firstStartedPulling="2026-04-16 20:21:50.076795575 +0000 UTC m=+610.611151279" lastFinishedPulling="2026-04-16 20:21:52.482169895 +0000 UTC m=+613.016525600" observedRunningTime="2026-04-16 20:21:52.748337933 +0000 UTC m=+613.282693659" watchObservedRunningTime="2026-04-16 20:21:52.75056318 +0000 UTC m=+613.284918907" Apr 16 20:22:23.740150 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:22:23.740123 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-66cf78b85b-b8wr9" Apr 16 20:22:27.455078 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:22:27.455045 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-b8wr9"] Apr 16 20:22:27.455483 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:22:27.455239 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-66cf78b85b-b8wr9" podUID="4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5" containerName="manager" containerID="cri-o://ee5adefbd4aaa2173401ee55c20449b7e726f81678291f226f0d0731e5dbc2e3" gracePeriod=10 Apr 16 20:22:27.480249 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:22:27.480217 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-8bftp"] Apr 16 20:22:27.483326 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:22:27.483306 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-8bftp" Apr 16 20:22:27.490866 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:22:27.490844 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-8bftp"] Apr 16 20:22:27.574506 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:22:27.574483 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93022683-4178-4a33-a1a4-6a94e6de288a-cert\") pod \"kserve-controller-manager-66cf78b85b-8bftp\" (UID: \"93022683-4178-4a33-a1a4-6a94e6de288a\") " pod="kserve/kserve-controller-manager-66cf78b85b-8bftp" Apr 16 20:22:27.574633 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:22:27.574529 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csk8q\" (UniqueName: \"kubernetes.io/projected/93022683-4178-4a33-a1a4-6a94e6de288a-kube-api-access-csk8q\") pod \"kserve-controller-manager-66cf78b85b-8bftp\" (UID: \"93022683-4178-4a33-a1a4-6a94e6de288a\") " pod="kserve/kserve-controller-manager-66cf78b85b-8bftp" Apr 16 20:22:27.675700 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:22:27.675671 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93022683-4178-4a33-a1a4-6a94e6de288a-cert\") pod \"kserve-controller-manager-66cf78b85b-8bftp\" (UID: \"93022683-4178-4a33-a1a4-6a94e6de288a\") " pod="kserve/kserve-controller-manager-66cf78b85b-8bftp" Apr 16 20:22:27.675838 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:22:27.675754 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-csk8q\" (UniqueName: \"kubernetes.io/projected/93022683-4178-4a33-a1a4-6a94e6de288a-kube-api-access-csk8q\") pod \"kserve-controller-manager-66cf78b85b-8bftp\" (UID: \"93022683-4178-4a33-a1a4-6a94e6de288a\") " 
pod="kserve/kserve-controller-manager-66cf78b85b-8bftp" Apr 16 20:22:27.678215 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:22:27.678189 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93022683-4178-4a33-a1a4-6a94e6de288a-cert\") pod \"kserve-controller-manager-66cf78b85b-8bftp\" (UID: \"93022683-4178-4a33-a1a4-6a94e6de288a\") " pod="kserve/kserve-controller-manager-66cf78b85b-8bftp" Apr 16 20:22:27.684253 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:22:27.684231 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-csk8q\" (UniqueName: \"kubernetes.io/projected/93022683-4178-4a33-a1a4-6a94e6de288a-kube-api-access-csk8q\") pod \"kserve-controller-manager-66cf78b85b-8bftp\" (UID: \"93022683-4178-4a33-a1a4-6a94e6de288a\") " pod="kserve/kserve-controller-manager-66cf78b85b-8bftp" Apr 16 20:22:27.693840 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:22:27.693823 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-b8wr9" Apr 16 20:22:27.776174 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:22:27.776116 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pz9f7\" (UniqueName: \"kubernetes.io/projected/4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5-kube-api-access-pz9f7\") pod \"4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5\" (UID: \"4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5\") " Apr 16 20:22:27.776264 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:22:27.776188 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5-cert\") pod \"4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5\" (UID: \"4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5\") " Apr 16 20:22:27.778126 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:22:27.778096 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5-kube-api-access-pz9f7" (OuterVolumeSpecName: "kube-api-access-pz9f7") pod "4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5" (UID: "4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5"). InnerVolumeSpecName "kube-api-access-pz9f7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:22:27.778126 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:22:27.778119 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5-cert" (OuterVolumeSpecName: "cert") pod "4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5" (UID: "4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:22:27.829278 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:22:27.829256 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-8bftp" Apr 16 20:22:27.847825 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:22:27.847800 2576 generic.go:358] "Generic (PLEG): container finished" podID="4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5" containerID="ee5adefbd4aaa2173401ee55c20449b7e726f81678291f226f0d0731e5dbc2e3" exitCode=0 Apr 16 20:22:27.847931 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:22:27.847834 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-b8wr9" event={"ID":"4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5","Type":"ContainerDied","Data":"ee5adefbd4aaa2173401ee55c20449b7e726f81678291f226f0d0731e5dbc2e3"} Apr 16 20:22:27.847931 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:22:27.847854 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-b8wr9" event={"ID":"4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5","Type":"ContainerDied","Data":"034a3f495478bf3c973e3ca4228e53a06465e154ba6ed16f988079196aa53ba3"} Apr 16 20:22:27.847931 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:22:27.847870 2576 scope.go:117] "RemoveContainer" containerID="ee5adefbd4aaa2173401ee55c20449b7e726f81678291f226f0d0731e5dbc2e3" Apr 16 20:22:27.847931 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:22:27.847906 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-b8wr9" Apr 16 20:22:27.855488 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:22:27.855448 2576 scope.go:117] "RemoveContainer" containerID="ee5adefbd4aaa2173401ee55c20449b7e726f81678291f226f0d0731e5dbc2e3" Apr 16 20:22:27.855726 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:22:27.855706 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee5adefbd4aaa2173401ee55c20449b7e726f81678291f226f0d0731e5dbc2e3\": container with ID starting with ee5adefbd4aaa2173401ee55c20449b7e726f81678291f226f0d0731e5dbc2e3 not found: ID does not exist" containerID="ee5adefbd4aaa2173401ee55c20449b7e726f81678291f226f0d0731e5dbc2e3" Apr 16 20:22:27.855801 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:22:27.855731 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee5adefbd4aaa2173401ee55c20449b7e726f81678291f226f0d0731e5dbc2e3"} err="failed to get container status \"ee5adefbd4aaa2173401ee55c20449b7e726f81678291f226f0d0731e5dbc2e3\": rpc error: code = NotFound desc = could not find container \"ee5adefbd4aaa2173401ee55c20449b7e726f81678291f226f0d0731e5dbc2e3\": container with ID starting with ee5adefbd4aaa2173401ee55c20449b7e726f81678291f226f0d0731e5dbc2e3 not found: ID does not exist" Apr 16 20:22:27.869330 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:22:27.869307 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-b8wr9"] Apr 16 20:22:27.874718 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:22:27.874694 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-b8wr9"] Apr 16 20:22:27.877633 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:22:27.877608 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pz9f7\" (UniqueName: 
\"kubernetes.io/projected/4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5-kube-api-access-pz9f7\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:22:27.877790 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:22:27.877638 2576 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5-cert\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:22:27.944594 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:22:27.944573 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-8bftp"] Apr 16 20:22:27.946534 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:22:27.946507 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93022683_4178_4a33_a1a4_6a94e6de288a.slice/crio-5a50f0a0de0c646db1bc626a1eeb3478562ccf807409ecfdfd6e7b45910e0a23 WatchSource:0}: Error finding container 5a50f0a0de0c646db1bc626a1eeb3478562ccf807409ecfdfd6e7b45910e0a23: Status 404 returned error can't find the container with id 5a50f0a0de0c646db1bc626a1eeb3478562ccf807409ecfdfd6e7b45910e0a23 Apr 16 20:22:28.058800 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:22:28.058733 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5" path="/var/lib/kubelet/pods/4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5/volumes" Apr 16 20:22:28.853412 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:22:28.853383 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-8bftp" event={"ID":"93022683-4178-4a33-a1a4-6a94e6de288a","Type":"ContainerStarted","Data":"814c8b076881f3b236de2c3a3fe6aceadd9dd948537f2a5ed35039949b720daa"} Apr 16 20:22:28.853412 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:22:28.853413 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-8bftp" 
event={"ID":"93022683-4178-4a33-a1a4-6a94e6de288a","Type":"ContainerStarted","Data":"5a50f0a0de0c646db1bc626a1eeb3478562ccf807409ecfdfd6e7b45910e0a23"} Apr 16 20:22:28.853808 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:22:28.853499 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-66cf78b85b-8bftp" Apr 16 20:22:28.868807 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:22:28.868767 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-66cf78b85b-8bftp" podStartSLOduration=1.554755524 podStartE2EDuration="1.868756159s" podCreationTimestamp="2026-04-16 20:22:27 +0000 UTC" firstStartedPulling="2026-04-16 20:22:27.94768573 +0000 UTC m=+648.482041435" lastFinishedPulling="2026-04-16 20:22:28.261686366 +0000 UTC m=+648.796042070" observedRunningTime="2026-04-16 20:22:28.867376553 +0000 UTC m=+649.401732279" watchObservedRunningTime="2026-04-16 20:22:28.868756159 +0000 UTC m=+649.403111884" Apr 16 20:22:59.861234 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:22:59.861206 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-66cf78b85b-8bftp" Apr 16 20:23:00.795285 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:00.795255 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-9ghn5"] Apr 16 20:23:00.795637 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:00.795620 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5" containerName="manager" Apr 16 20:23:00.795721 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:00.795640 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5" containerName="manager" Apr 16 20:23:00.795721 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:00.795703 2576 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="4e69f8d8-a444-4bd7-af15-0dd63c7fc8c5" containerName="manager" Apr 16 20:23:00.798715 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:00.798692 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-9ghn5" Apr 16 20:23:00.800971 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:00.800950 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 16 20:23:00.800971 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:00.801024 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-t65kr\"" Apr 16 20:23:00.807518 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:00.807497 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-9ghn5"] Apr 16 20:23:00.898368 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:00.898332 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7g7k\" (UniqueName: \"kubernetes.io/projected/99f4da0f-2530-4202-bffc-0147d7fb0a74-kube-api-access-t7g7k\") pod \"odh-model-controller-696fc77849-9ghn5\" (UID: \"99f4da0f-2530-4202-bffc-0147d7fb0a74\") " pod="kserve/odh-model-controller-696fc77849-9ghn5" Apr 16 20:23:00.898368 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:00.898370 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99f4da0f-2530-4202-bffc-0147d7fb0a74-cert\") pod \"odh-model-controller-696fc77849-9ghn5\" (UID: \"99f4da0f-2530-4202-bffc-0147d7fb0a74\") " pod="kserve/odh-model-controller-696fc77849-9ghn5" Apr 16 20:23:00.999123 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:00.999089 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t7g7k\" (UniqueName: 
\"kubernetes.io/projected/99f4da0f-2530-4202-bffc-0147d7fb0a74-kube-api-access-t7g7k\") pod \"odh-model-controller-696fc77849-9ghn5\" (UID: \"99f4da0f-2530-4202-bffc-0147d7fb0a74\") " pod="kserve/odh-model-controller-696fc77849-9ghn5" Apr 16 20:23:00.999297 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:00.999127 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99f4da0f-2530-4202-bffc-0147d7fb0a74-cert\") pod \"odh-model-controller-696fc77849-9ghn5\" (UID: \"99f4da0f-2530-4202-bffc-0147d7fb0a74\") " pod="kserve/odh-model-controller-696fc77849-9ghn5" Apr 16 20:23:01.001438 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:01.001412 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99f4da0f-2530-4202-bffc-0147d7fb0a74-cert\") pod \"odh-model-controller-696fc77849-9ghn5\" (UID: \"99f4da0f-2530-4202-bffc-0147d7fb0a74\") " pod="kserve/odh-model-controller-696fc77849-9ghn5" Apr 16 20:23:01.013039 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:01.013014 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7g7k\" (UniqueName: \"kubernetes.io/projected/99f4da0f-2530-4202-bffc-0147d7fb0a74-kube-api-access-t7g7k\") pod \"odh-model-controller-696fc77849-9ghn5\" (UID: \"99f4da0f-2530-4202-bffc-0147d7fb0a74\") " pod="kserve/odh-model-controller-696fc77849-9ghn5" Apr 16 20:23:01.109124 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:01.109055 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-9ghn5" Apr 16 20:23:01.222957 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:01.222863 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-9ghn5"] Apr 16 20:23:01.227600 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:23:01.227573 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99f4da0f_2530_4202_bffc_0147d7fb0a74.slice/crio-4bb5244749724818a05bcc622d14eced8db36402c8cba3ad79d89bcdcb59726f WatchSource:0}: Error finding container 4bb5244749724818a05bcc622d14eced8db36402c8cba3ad79d89bcdcb59726f: Status 404 returned error can't find the container with id 4bb5244749724818a05bcc622d14eced8db36402c8cba3ad79d89bcdcb59726f Apr 16 20:23:01.228717 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:01.228700 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:23:01.953217 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:01.953177 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-9ghn5" event={"ID":"99f4da0f-2530-4202-bffc-0147d7fb0a74","Type":"ContainerStarted","Data":"4bb5244749724818a05bcc622d14eced8db36402c8cba3ad79d89bcdcb59726f"} Apr 16 20:23:03.961534 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:03.961495 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-9ghn5" event={"ID":"99f4da0f-2530-4202-bffc-0147d7fb0a74","Type":"ContainerStarted","Data":"b7fc9288cff81806f352f5242980f7652747101803217db6474477af3b938455"} Apr 16 20:23:03.961880 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:03.961616 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-9ghn5" Apr 16 20:23:03.977647 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:03.977604 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-9ghn5" podStartSLOduration=1.5674896550000001 podStartE2EDuration="3.977591888s" podCreationTimestamp="2026-04-16 20:23:00 +0000 UTC" firstStartedPulling="2026-04-16 20:23:01.228816252 +0000 UTC m=+681.763171957" lastFinishedPulling="2026-04-16 20:23:03.638918471 +0000 UTC m=+684.173274190" observedRunningTime="2026-04-16 20:23:03.975813605 +0000 UTC m=+684.510169331" watchObservedRunningTime="2026-04-16 20:23:03.977591888 +0000 UTC m=+684.511947646" Apr 16 20:23:14.967581 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:14.967547 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-9ghn5" Apr 16 20:23:36.608405 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:36.608209 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf"] Apr 16 20:23:36.612475 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:36.612450 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf" Apr 16 20:23:36.615474 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:36.615443 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-1-openshift-default-dockercfg-6qv9c\"" Apr 16 20:23:36.615707 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:36.615689 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\"" Apr 16 20:23:36.615913 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:36.615896 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 20:23:36.616149 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:36.616132 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 20:23:36.624184 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:36.624161 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf"] Apr 16 20:23:36.768136 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:36.768101 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/89fab457-9788-4391-ac81-da6b2b19ffdd-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ctvvf\" (UID: \"89fab457-9788-4391-ac81-da6b2b19ffdd\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf" Apr 16 20:23:36.768341 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:36.768142 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/89fab457-9788-4391-ac81-da6b2b19ffdd-istio-token\") pod 
\"router-gateway-1-openshift-default-6c59fbf55c-ctvvf\" (UID: \"89fab457-9788-4391-ac81-da6b2b19ffdd\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf" Apr 16 20:23:36.768341 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:36.768173 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/89fab457-9788-4391-ac81-da6b2b19ffdd-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ctvvf\" (UID: \"89fab457-9788-4391-ac81-da6b2b19ffdd\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf" Apr 16 20:23:36.768341 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:36.768197 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msx6t\" (UniqueName: \"kubernetes.io/projected/89fab457-9788-4391-ac81-da6b2b19ffdd-kube-api-access-msx6t\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ctvvf\" (UID: \"89fab457-9788-4391-ac81-da6b2b19ffdd\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf" Apr 16 20:23:36.768341 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:36.768222 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/89fab457-9788-4391-ac81-da6b2b19ffdd-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ctvvf\" (UID: \"89fab457-9788-4391-ac81-da6b2b19ffdd\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf" Apr 16 20:23:36.768341 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:36.768288 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/89fab457-9788-4391-ac81-da6b2b19ffdd-istio-podinfo\") pod 
\"router-gateway-1-openshift-default-6c59fbf55c-ctvvf\" (UID: \"89fab457-9788-4391-ac81-da6b2b19ffdd\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf" Apr 16 20:23:36.768341 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:36.768329 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/89fab457-9788-4391-ac81-da6b2b19ffdd-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ctvvf\" (UID: \"89fab457-9788-4391-ac81-da6b2b19ffdd\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf" Apr 16 20:23:36.768704 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:36.768374 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/89fab457-9788-4391-ac81-da6b2b19ffdd-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ctvvf\" (UID: \"89fab457-9788-4391-ac81-da6b2b19ffdd\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf" Apr 16 20:23:36.768704 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:36.768392 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/89fab457-9788-4391-ac81-da6b2b19ffdd-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ctvvf\" (UID: \"89fab457-9788-4391-ac81-da6b2b19ffdd\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf" Apr 16 20:23:36.869681 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:36.869576 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/89fab457-9788-4391-ac81-da6b2b19ffdd-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ctvvf\" (UID: \"89fab457-9788-4391-ac81-da6b2b19ffdd\") 
" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf" Apr 16 20:23:36.869681 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:36.869651 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/89fab457-9788-4391-ac81-da6b2b19ffdd-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ctvvf\" (UID: \"89fab457-9788-4391-ac81-da6b2b19ffdd\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf" Apr 16 20:23:36.869918 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:36.869700 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/89fab457-9788-4391-ac81-da6b2b19ffdd-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ctvvf\" (UID: \"89fab457-9788-4391-ac81-da6b2b19ffdd\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf" Apr 16 20:23:36.869918 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:36.869726 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/89fab457-9788-4391-ac81-da6b2b19ffdd-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ctvvf\" (UID: \"89fab457-9788-4391-ac81-da6b2b19ffdd\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf" Apr 16 20:23:36.869918 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:36.869758 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/89fab457-9788-4391-ac81-da6b2b19ffdd-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ctvvf\" (UID: \"89fab457-9788-4391-ac81-da6b2b19ffdd\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf" Apr 16 20:23:36.869918 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:36.869785 
2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/89fab457-9788-4391-ac81-da6b2b19ffdd-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ctvvf\" (UID: \"89fab457-9788-4391-ac81-da6b2b19ffdd\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf" Apr 16 20:23:36.869918 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:36.869813 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/89fab457-9788-4391-ac81-da6b2b19ffdd-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ctvvf\" (UID: \"89fab457-9788-4391-ac81-da6b2b19ffdd\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf" Apr 16 20:23:36.869918 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:36.869837 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-msx6t\" (UniqueName: \"kubernetes.io/projected/89fab457-9788-4391-ac81-da6b2b19ffdd-kube-api-access-msx6t\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ctvvf\" (UID: \"89fab457-9788-4391-ac81-da6b2b19ffdd\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf" Apr 16 20:23:36.869918 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:36.869870 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/89fab457-9788-4391-ac81-da6b2b19ffdd-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ctvvf\" (UID: \"89fab457-9788-4391-ac81-da6b2b19ffdd\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf" Apr 16 20:23:36.870288 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:36.870176 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: 
\"kubernetes.io/empty-dir/89fab457-9788-4391-ac81-da6b2b19ffdd-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ctvvf\" (UID: \"89fab457-9788-4391-ac81-da6b2b19ffdd\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf" Apr 16 20:23:36.870340 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:36.870286 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/89fab457-9788-4391-ac81-da6b2b19ffdd-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ctvvf\" (UID: \"89fab457-9788-4391-ac81-da6b2b19ffdd\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf" Apr 16 20:23:36.870393 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:36.870348 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/89fab457-9788-4391-ac81-da6b2b19ffdd-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ctvvf\" (UID: \"89fab457-9788-4391-ac81-da6b2b19ffdd\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf" Apr 16 20:23:36.870457 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:36.870433 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/89fab457-9788-4391-ac81-da6b2b19ffdd-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ctvvf\" (UID: \"89fab457-9788-4391-ac81-da6b2b19ffdd\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf" Apr 16 20:23:36.870840 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:36.870822 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/89fab457-9788-4391-ac81-da6b2b19ffdd-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ctvvf\" (UID: 
\"89fab457-9788-4391-ac81-da6b2b19ffdd\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf" Apr 16 20:23:36.872762 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:36.872740 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/89fab457-9788-4391-ac81-da6b2b19ffdd-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ctvvf\" (UID: \"89fab457-9788-4391-ac81-da6b2b19ffdd\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf" Apr 16 20:23:36.872955 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:36.872937 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/89fab457-9788-4391-ac81-da6b2b19ffdd-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ctvvf\" (UID: \"89fab457-9788-4391-ac81-da6b2b19ffdd\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf" Apr 16 20:23:36.878806 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:36.878768 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/89fab457-9788-4391-ac81-da6b2b19ffdd-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ctvvf\" (UID: \"89fab457-9788-4391-ac81-da6b2b19ffdd\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf" Apr 16 20:23:36.879103 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:36.879068 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-msx6t\" (UniqueName: \"kubernetes.io/projected/89fab457-9788-4391-ac81-da6b2b19ffdd-kube-api-access-msx6t\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ctvvf\" (UID: \"89fab457-9788-4391-ac81-da6b2b19ffdd\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf" Apr 16 20:23:36.925967 ip-10-0-141-145 
kubenswrapper[2576]: I0416 20:23:36.925930 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf"
Apr 16 20:23:37.057829 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:37.057795 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf"]
Apr 16 20:23:37.059936 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:23:37.059903 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89fab457_9788_4391_ac81_da6b2b19ffdd.slice/crio-bded365799134e601e07c807f80a31c8751127e42475e14c0a2d7192b3449aaf WatchSource:0}: Error finding container bded365799134e601e07c807f80a31c8751127e42475e14c0a2d7192b3449aaf: Status 404 returned error can't find the container with id bded365799134e601e07c807f80a31c8751127e42475e14c0a2d7192b3449aaf
Apr 16 20:23:37.061972 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:37.061939 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 16 20:23:37.062065 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:37.062024 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 16 20:23:37.062065 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:37.062052 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 16 20:23:38.070320 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:38.070277 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf" event={"ID":"89fab457-9788-4391-ac81-da6b2b19ffdd","Type":"ContainerStarted","Data":"88e012a9549b2d194202f31b300766ff31ab812fb40eee10dd0c09b8ae36a59c"}
Apr 16 20:23:38.070320 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:38.070312 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf" event={"ID":"89fab457-9788-4391-ac81-da6b2b19ffdd","Type":"ContainerStarted","Data":"bded365799134e601e07c807f80a31c8751127e42475e14c0a2d7192b3449aaf"}
Apr 16 20:23:38.090225 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:38.090177 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf" podStartSLOduration=2.090160642 podStartE2EDuration="2.090160642s" podCreationTimestamp="2026-04-16 20:23:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:23:38.088200774 +0000 UTC m=+718.622556511" watchObservedRunningTime="2026-04-16 20:23:38.090160642 +0000 UTC m=+718.624516369"
Apr 16 20:23:38.926999 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:38.926936 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf"
Apr 16 20:23:38.931806 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:38.931779 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf"
Apr 16 20:23:39.073624 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:39.073593 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf"
Apr 16 20:23:39.074796 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:39.074779 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ctvvf"
Apr 16 20:23:40.059339 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:40.059300 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m"]
Apr 16 20:23:40.064208 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:40.064180 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m"
Apr 16 20:23:40.067150 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:40.067128 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-s7skk\""
Apr 16 20:23:40.067150 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:40.067141 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-epp-sa-dockercfg-bv9nn\""
Apr 16 20:23:40.067681 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:40.067665 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\""
Apr 16 20:23:40.071674 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:40.071652 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m"]
Apr 16 20:23:40.098875 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:40.098846 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/56f1666b-f1e1-4474-9084-dde4153fc15d-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m\" (UID: \"56f1666b-f1e1-4474-9084-dde4153fc15d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m"
Apr 16 20:23:40.099315 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:40.098882 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/56f1666b-f1e1-4474-9084-dde4153fc15d-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m\" (UID: \"56f1666b-f1e1-4474-9084-dde4153fc15d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m"
Apr 16 20:23:40.099315 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:40.098907 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k565\" (UniqueName: \"kubernetes.io/projected/56f1666b-f1e1-4474-9084-dde4153fc15d-kube-api-access-9k565\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m\" (UID: \"56f1666b-f1e1-4474-9084-dde4153fc15d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m"
Apr 16 20:23:40.099315 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:40.099093 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/56f1666b-f1e1-4474-9084-dde4153fc15d-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m\" (UID: \"56f1666b-f1e1-4474-9084-dde4153fc15d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m"
Apr 16 20:23:40.099315 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:40.099178 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/56f1666b-f1e1-4474-9084-dde4153fc15d-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m\" (UID: \"56f1666b-f1e1-4474-9084-dde4153fc15d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m"
Apr 16 20:23:40.099315 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:40.099196 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/56f1666b-f1e1-4474-9084-dde4153fc15d-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m\" (UID: \"56f1666b-f1e1-4474-9084-dde4153fc15d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m"
Apr 16 20:23:40.199740 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:40.199649 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/56f1666b-f1e1-4474-9084-dde4153fc15d-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m\" (UID: \"56f1666b-f1e1-4474-9084-dde4153fc15d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m"
Apr 16 20:23:40.199895 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:40.199829 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/56f1666b-f1e1-4474-9084-dde4153fc15d-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m\" (UID: \"56f1666b-f1e1-4474-9084-dde4153fc15d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m"
Apr 16 20:23:40.199895 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:40.199864 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/56f1666b-f1e1-4474-9084-dde4153fc15d-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m\" (UID: \"56f1666b-f1e1-4474-9084-dde4153fc15d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m"
Apr 16 20:23:40.200037 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:40.199903 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/56f1666b-f1e1-4474-9084-dde4153fc15d-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m\" (UID: \"56f1666b-f1e1-4474-9084-dde4153fc15d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m"
Apr 16 20:23:40.200037 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:40.199939 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/56f1666b-f1e1-4474-9084-dde4153fc15d-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m\" (UID: \"56f1666b-f1e1-4474-9084-dde4153fc15d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m"
Apr 16 20:23:40.200037 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:40.199963 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9k565\" (UniqueName: \"kubernetes.io/projected/56f1666b-f1e1-4474-9084-dde4153fc15d-kube-api-access-9k565\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m\" (UID: \"56f1666b-f1e1-4474-9084-dde4153fc15d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m"
Apr 16 20:23:40.200186 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:40.200147 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/56f1666b-f1e1-4474-9084-dde4153fc15d-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m\" (UID: \"56f1666b-f1e1-4474-9084-dde4153fc15d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m"
Apr 16 20:23:40.200268 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:40.200245 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/56f1666b-f1e1-4474-9084-dde4153fc15d-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m\" (UID: \"56f1666b-f1e1-4474-9084-dde4153fc15d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m"
Apr 16 20:23:40.200542 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:40.200502 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/56f1666b-f1e1-4474-9084-dde4153fc15d-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m\" (UID: \"56f1666b-f1e1-4474-9084-dde4153fc15d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m"
Apr 16 20:23:40.200657 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:40.200608 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/56f1666b-f1e1-4474-9084-dde4153fc15d-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m\" (UID: \"56f1666b-f1e1-4474-9084-dde4153fc15d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m"
Apr 16 20:23:40.202645 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:40.202625 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/56f1666b-f1e1-4474-9084-dde4153fc15d-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m\" (UID: \"56f1666b-f1e1-4474-9084-dde4153fc15d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m"
Apr 16 20:23:40.207703 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:40.207680 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k565\" (UniqueName: \"kubernetes.io/projected/56f1666b-f1e1-4474-9084-dde4153fc15d-kube-api-access-9k565\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m\" (UID: \"56f1666b-f1e1-4474-9084-dde4153fc15d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m"
Apr 16 20:23:40.376031 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:40.375997 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m"
Apr 16 20:23:40.503737 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:40.503711 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m"]
Apr 16 20:23:40.505824 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:23:40.505796 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56f1666b_f1e1_4474_9084_dde4153fc15d.slice/crio-6b2eca9ab21ff1fd6ff7b596d6827d5249eb7e0ccdde309335177d1c013c3e15 WatchSource:0}: Error finding container 6b2eca9ab21ff1fd6ff7b596d6827d5249eb7e0ccdde309335177d1c013c3e15: Status 404 returned error can't find the container with id 6b2eca9ab21ff1fd6ff7b596d6827d5249eb7e0ccdde309335177d1c013c3e15
Apr 16 20:23:41.081720 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:41.081676 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m" event={"ID":"56f1666b-f1e1-4474-9084-dde4153fc15d","Type":"ContainerStarted","Data":"6b2eca9ab21ff1fd6ff7b596d6827d5249eb7e0ccdde309335177d1c013c3e15"}
Apr 16 20:23:44.091761 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:44.091679 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m" event={"ID":"56f1666b-f1e1-4474-9084-dde4153fc15d","Type":"ContainerStarted","Data":"1f5fe27ef8d3e117e73e3641674d0964caa6d111f915be8cafdd95c9e732eaa2"}
Apr 16 20:23:45.095845 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:45.095809 2576 generic.go:358] "Generic (PLEG): container finished" podID="56f1666b-f1e1-4474-9084-dde4153fc15d" containerID="1f5fe27ef8d3e117e73e3641674d0964caa6d111f915be8cafdd95c9e732eaa2" exitCode=0
Apr 16 20:23:45.096297 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:45.095887 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m" event={"ID":"56f1666b-f1e1-4474-9084-dde4153fc15d","Type":"ContainerDied","Data":"1f5fe27ef8d3e117e73e3641674d0964caa6d111f915be8cafdd95c9e732eaa2"}
Apr 16 20:23:47.124970 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:23:47.124930 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m" event={"ID":"56f1666b-f1e1-4474-9084-dde4153fc15d","Type":"ContainerStarted","Data":"3bce55bc31b65a34ba0c309102696ae6f41b8ba90c7701d2698dda74875a2ff8"}
Apr 16 20:24:16.238285 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:16.238245 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m" event={"ID":"56f1666b-f1e1-4474-9084-dde4153fc15d","Type":"ContainerStarted","Data":"c02085a0d7f3d708a730a84b843b1846eb2466d45ea643bef4dec50006ff954b"}
Apr 16 20:24:16.238683 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:16.238385 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m"
Apr 16 20:24:16.240638 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:16.240618 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m"
Apr 16 20:24:16.264972 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:16.264927 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m" podStartSLOduration=0.682887494 podStartE2EDuration="36.26491534s" podCreationTimestamp="2026-04-16 20:23:40 +0000 UTC" firstStartedPulling="2026-04-16 20:23:40.50813434 +0000 UTC m=+721.042490047" lastFinishedPulling="2026-04-16 20:24:16.090162185 +0000 UTC m=+756.624517893" observedRunningTime="2026-04-16 20:24:16.262626639 +0000 UTC m=+756.796982387" watchObservedRunningTime="2026-04-16 20:24:16.26491534 +0000 UTC m=+756.799271065"
Apr 16 20:24:20.376456 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:20.376423 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m"
Apr 16 20:24:20.376456 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:20.376461 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m"
Apr 16 20:24:30.377854 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:30.377820 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m"
Apr 16 20:24:30.379008 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:30.378989 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m"
Apr 16 20:24:31.811772 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:31.811740 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m"]
Apr 16 20:24:32.289255 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:32.289194 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m" podUID="56f1666b-f1e1-4474-9084-dde4153fc15d" containerName="main" containerID="cri-o://3bce55bc31b65a34ba0c309102696ae6f41b8ba90c7701d2698dda74875a2ff8" gracePeriod=30
Apr 16 20:24:32.289433 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:32.289223 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m" podUID="56f1666b-f1e1-4474-9084-dde4153fc15d" containerName="tokenizer" containerID="cri-o://c02085a0d7f3d708a730a84b843b1846eb2466d45ea643bef4dec50006ff954b" gracePeriod=30
Apr 16 20:24:33.293918 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:33.293888 2576 generic.go:358] "Generic (PLEG): container finished" podID="56f1666b-f1e1-4474-9084-dde4153fc15d" containerID="3bce55bc31b65a34ba0c309102696ae6f41b8ba90c7701d2698dda74875a2ff8" exitCode=0
Apr 16 20:24:33.294257 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:33.293962 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m" event={"ID":"56f1666b-f1e1-4474-9084-dde4153fc15d","Type":"ContainerDied","Data":"3bce55bc31b65a34ba0c309102696ae6f41b8ba90c7701d2698dda74875a2ff8"}
Apr 16 20:24:33.424395 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:33.424368 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m"
Apr 16 20:24:33.553092 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:33.553061 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/56f1666b-f1e1-4474-9084-dde4153fc15d-tls-certs\") pod \"56f1666b-f1e1-4474-9084-dde4153fc15d\" (UID: \"56f1666b-f1e1-4474-9084-dde4153fc15d\") "
Apr 16 20:24:33.553241 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:33.553106 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9k565\" (UniqueName: \"kubernetes.io/projected/56f1666b-f1e1-4474-9084-dde4153fc15d-kube-api-access-9k565\") pod \"56f1666b-f1e1-4474-9084-dde4153fc15d\" (UID: \"56f1666b-f1e1-4474-9084-dde4153fc15d\") "
Apr 16 20:24:33.553241 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:33.553151 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/56f1666b-f1e1-4474-9084-dde4153fc15d-kserve-provision-location\") pod \"56f1666b-f1e1-4474-9084-dde4153fc15d\" (UID: \"56f1666b-f1e1-4474-9084-dde4153fc15d\") "
Apr 16 20:24:33.553241 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:33.553181 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/56f1666b-f1e1-4474-9084-dde4153fc15d-tokenizer-tmp\") pod \"56f1666b-f1e1-4474-9084-dde4153fc15d\" (UID: \"56f1666b-f1e1-4474-9084-dde4153fc15d\") "
Apr 16 20:24:33.553241 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:33.553202 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/56f1666b-f1e1-4474-9084-dde4153fc15d-tokenizer-uds\") pod \"56f1666b-f1e1-4474-9084-dde4153fc15d\" (UID: \"56f1666b-f1e1-4474-9084-dde4153fc15d\") "
Apr 16 20:24:33.553241 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:33.553237 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/56f1666b-f1e1-4474-9084-dde4153fc15d-tokenizer-cache\") pod \"56f1666b-f1e1-4474-9084-dde4153fc15d\" (UID: \"56f1666b-f1e1-4474-9084-dde4153fc15d\") "
Apr 16 20:24:33.553557 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:33.553507 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56f1666b-f1e1-4474-9084-dde4153fc15d-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "56f1666b-f1e1-4474-9084-dde4153fc15d" (UID: "56f1666b-f1e1-4474-9084-dde4153fc15d"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:24:33.553620 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:33.553531 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56f1666b-f1e1-4474-9084-dde4153fc15d-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "56f1666b-f1e1-4474-9084-dde4153fc15d" (UID: "56f1666b-f1e1-4474-9084-dde4153fc15d"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:24:33.553620 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:33.553576 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56f1666b-f1e1-4474-9084-dde4153fc15d-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "56f1666b-f1e1-4474-9084-dde4153fc15d" (UID: "56f1666b-f1e1-4474-9084-dde4153fc15d"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:24:33.553859 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:33.553838 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56f1666b-f1e1-4474-9084-dde4153fc15d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "56f1666b-f1e1-4474-9084-dde4153fc15d" (UID: "56f1666b-f1e1-4474-9084-dde4153fc15d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:24:33.555222 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:33.555204 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56f1666b-f1e1-4474-9084-dde4153fc15d-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "56f1666b-f1e1-4474-9084-dde4153fc15d" (UID: "56f1666b-f1e1-4474-9084-dde4153fc15d"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:24:33.555328 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:33.555265 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56f1666b-f1e1-4474-9084-dde4153fc15d-kube-api-access-9k565" (OuterVolumeSpecName: "kube-api-access-9k565") pod "56f1666b-f1e1-4474-9084-dde4153fc15d" (UID: "56f1666b-f1e1-4474-9084-dde4153fc15d"). InnerVolumeSpecName "kube-api-access-9k565". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:24:33.653872 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:33.653851 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/56f1666b-f1e1-4474-9084-dde4153fc15d-tls-certs\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\""
Apr 16 20:24:33.653872 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:33.653871 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9k565\" (UniqueName: \"kubernetes.io/projected/56f1666b-f1e1-4474-9084-dde4153fc15d-kube-api-access-9k565\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\""
Apr 16 20:24:33.654004 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:33.653881 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/56f1666b-f1e1-4474-9084-dde4153fc15d-kserve-provision-location\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\""
Apr 16 20:24:33.654004 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:33.653891 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/56f1666b-f1e1-4474-9084-dde4153fc15d-tokenizer-tmp\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\""
Apr 16 20:24:33.654004 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:33.653901 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/56f1666b-f1e1-4474-9084-dde4153fc15d-tokenizer-uds\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\""
Apr 16 20:24:33.654004 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:33.653908 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/56f1666b-f1e1-4474-9084-dde4153fc15d-tokenizer-cache\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\""
Apr 16 20:24:34.297985 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:34.297956 2576 generic.go:358] "Generic (PLEG): container finished" podID="56f1666b-f1e1-4474-9084-dde4153fc15d" containerID="c02085a0d7f3d708a730a84b843b1846eb2466d45ea643bef4dec50006ff954b" exitCode=0
Apr 16 20:24:34.298278 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:34.298007 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m" event={"ID":"56f1666b-f1e1-4474-9084-dde4153fc15d","Type":"ContainerDied","Data":"c02085a0d7f3d708a730a84b843b1846eb2466d45ea643bef4dec50006ff954b"}
Apr 16 20:24:34.298278 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:34.298042 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m" event={"ID":"56f1666b-f1e1-4474-9084-dde4153fc15d","Type":"ContainerDied","Data":"6b2eca9ab21ff1fd6ff7b596d6827d5249eb7e0ccdde309335177d1c013c3e15"}
Apr 16 20:24:34.298278 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:34.298041 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m"
Apr 16 20:24:34.298278 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:34.298110 2576 scope.go:117] "RemoveContainer" containerID="c02085a0d7f3d708a730a84b843b1846eb2466d45ea643bef4dec50006ff954b"
Apr 16 20:24:34.310346 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:34.310165 2576 scope.go:117] "RemoveContainer" containerID="3bce55bc31b65a34ba0c309102696ae6f41b8ba90c7701d2698dda74875a2ff8"
Apr 16 20:24:34.317040 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:34.317018 2576 scope.go:117] "RemoveContainer" containerID="1f5fe27ef8d3e117e73e3641674d0964caa6d111f915be8cafdd95c9e732eaa2"
Apr 16 20:24:34.321558 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:34.321534 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m"]
Apr 16 20:24:34.324185 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:34.324165 2576 scope.go:117] "RemoveContainer" containerID="c02085a0d7f3d708a730a84b843b1846eb2466d45ea643bef4dec50006ff954b"
Apr 16 20:24:34.324440 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:24:34.324417 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c02085a0d7f3d708a730a84b843b1846eb2466d45ea643bef4dec50006ff954b\": container with ID starting with c02085a0d7f3d708a730a84b843b1846eb2466d45ea643bef4dec50006ff954b not found: ID does not exist" containerID="c02085a0d7f3d708a730a84b843b1846eb2466d45ea643bef4dec50006ff954b"
Apr 16 20:24:34.324493 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:34.324448 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c02085a0d7f3d708a730a84b843b1846eb2466d45ea643bef4dec50006ff954b"} err="failed to get container status \"c02085a0d7f3d708a730a84b843b1846eb2466d45ea643bef4dec50006ff954b\": rpc error: code = NotFound desc = could not find container \"c02085a0d7f3d708a730a84b843b1846eb2466d45ea643bef4dec50006ff954b\": container with ID starting with c02085a0d7f3d708a730a84b843b1846eb2466d45ea643bef4dec50006ff954b not found: ID does not exist"
Apr 16 20:24:34.324493 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:34.324469 2576 scope.go:117] "RemoveContainer" containerID="3bce55bc31b65a34ba0c309102696ae6f41b8ba90c7701d2698dda74875a2ff8"
Apr 16 20:24:34.325153 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:24:34.324716 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bce55bc31b65a34ba0c309102696ae6f41b8ba90c7701d2698dda74875a2ff8\": container with ID starting with 3bce55bc31b65a34ba0c309102696ae6f41b8ba90c7701d2698dda74875a2ff8 not found: ID does not exist" containerID="3bce55bc31b65a34ba0c309102696ae6f41b8ba90c7701d2698dda74875a2ff8"
Apr 16 20:24:34.325153 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:34.324747 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bce55bc31b65a34ba0c309102696ae6f41b8ba90c7701d2698dda74875a2ff8"} err="failed to get container status \"3bce55bc31b65a34ba0c309102696ae6f41b8ba90c7701d2698dda74875a2ff8\": rpc error: code = NotFound desc = could not find container \"3bce55bc31b65a34ba0c309102696ae6f41b8ba90c7701d2698dda74875a2ff8\": container with ID starting with 3bce55bc31b65a34ba0c309102696ae6f41b8ba90c7701d2698dda74875a2ff8 not found: ID does not exist"
Apr 16 20:24:34.325153 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:34.324768 2576 scope.go:117] "RemoveContainer" containerID="1f5fe27ef8d3e117e73e3641674d0964caa6d111f915be8cafdd95c9e732eaa2"
Apr 16 20:24:34.325547 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:24:34.325522 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f5fe27ef8d3e117e73e3641674d0964caa6d111f915be8cafdd95c9e732eaa2\": container with ID starting with 1f5fe27ef8d3e117e73e3641674d0964caa6d111f915be8cafdd95c9e732eaa2 not found: ID does not exist" containerID="1f5fe27ef8d3e117e73e3641674d0964caa6d111f915be8cafdd95c9e732eaa2"
Apr 16 20:24:34.325633 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:34.325560 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f5fe27ef8d3e117e73e3641674d0964caa6d111f915be8cafdd95c9e732eaa2"} err="failed to get container status \"1f5fe27ef8d3e117e73e3641674d0964caa6d111f915be8cafdd95c9e732eaa2\": rpc error: code = NotFound desc = could not find container \"1f5fe27ef8d3e117e73e3641674d0964caa6d111f915be8cafdd95c9e732eaa2\": container with ID starting with 1f5fe27ef8d3e117e73e3641674d0964caa6d111f915be8cafdd95c9e732eaa2 not found: ID does not exist"
Apr 16 20:24:34.326457 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:34.326437 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-549c8hmw7m"]
Apr 16 20:24:36.058866 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:36.058836 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56f1666b-f1e1-4474-9084-dde4153fc15d" path="/var/lib/kubelet/pods/56f1666b-f1e1-4474-9084-dde4153fc15d/volumes"
Apr 16 20:24:38.249842 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:38.249812 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk"]
Apr 16 20:24:38.250321 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:38.250291 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56f1666b-f1e1-4474-9084-dde4153fc15d" containerName="tokenizer"
Apr 16 20:24:38.250321 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:38.250308 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="56f1666b-f1e1-4474-9084-dde4153fc15d" containerName="tokenizer"
Apr 16 20:24:38.250420 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:38.250332 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56f1666b-f1e1-4474-9084-dde4153fc15d" containerName="main"
Apr 16 20:24:38.250420 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:38.250340 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="56f1666b-f1e1-4474-9084-dde4153fc15d" containerName="main"
Apr 16 20:24:38.250420 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:38.250357 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56f1666b-f1e1-4474-9084-dde4153fc15d" containerName="storage-initializer"
Apr 16 20:24:38.250420 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:38.250366 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="56f1666b-f1e1-4474-9084-dde4153fc15d" containerName="storage-initializer"
Apr 16 20:24:38.250607 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:38.250439 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="56f1666b-f1e1-4474-9084-dde4153fc15d" containerName="main"
Apr 16 20:24:38.250607 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:38.250450 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="56f1666b-f1e1-4474-9084-dde4153fc15d" containerName="tokenizer"
Apr 16 20:24:38.255751 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:38.255728 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk"
Apr 16 20:24:38.258291 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:38.258273 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-s7skk\""
Apr 16 20:24:38.259114 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:38.259097 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\""
Apr 16 20:24:38.259165 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:38.259099 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-cqssp\""
Apr 16 20:24:38.266392 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:38.266369 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk"]
Apr 16 20:24:38.389222 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:38.389179 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3ed75c45-c2ee-454f-a790-25274298b31d-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk\" (UID: \"3ed75c45-c2ee-454f-a790-25274298b31d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk"
Apr 16 20:24:38.389418 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:38.389250 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3ed75c45-c2ee-454f-a790-25274298b31d-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk\" (UID: \"3ed75c45-c2ee-454f-a790-25274298b31d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk"
Apr 16 20:24:38.389418 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:38.389297 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjdp6\" (UniqueName: \"kubernetes.io/projected/3ed75c45-c2ee-454f-a790-25274298b31d-kube-api-access-pjdp6\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk\" (UID: \"3ed75c45-c2ee-454f-a790-25274298b31d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk"
Apr 16 20:24:38.389418 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:38.389330 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3ed75c45-c2ee-454f-a790-25274298b31d-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk\" (UID: \"3ed75c45-c2ee-454f-a790-25274298b31d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk"
Apr 16 20:24:38.389418 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:38.389369 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3ed75c45-c2ee-454f-a790-25274298b31d-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk\" (UID: \"3ed75c45-c2ee-454f-a790-25274298b31d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk"
Apr 16 20:24:38.389563 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:38.389431 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3ed75c45-c2ee-454f-a790-25274298b31d-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk\"
(UID: \"3ed75c45-c2ee-454f-a790-25274298b31d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk" Apr 16 20:24:38.490709 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:38.490673 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pjdp6\" (UniqueName: \"kubernetes.io/projected/3ed75c45-c2ee-454f-a790-25274298b31d-kube-api-access-pjdp6\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk\" (UID: \"3ed75c45-c2ee-454f-a790-25274298b31d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk" Apr 16 20:24:38.490893 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:38.490720 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3ed75c45-c2ee-454f-a790-25274298b31d-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk\" (UID: \"3ed75c45-c2ee-454f-a790-25274298b31d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk" Apr 16 20:24:38.490893 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:38.490748 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3ed75c45-c2ee-454f-a790-25274298b31d-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk\" (UID: \"3ed75c45-c2ee-454f-a790-25274298b31d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk" Apr 16 20:24:38.490893 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:38.490772 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3ed75c45-c2ee-454f-a790-25274298b31d-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk\" (UID: 
\"3ed75c45-c2ee-454f-a790-25274298b31d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk" Apr 16 20:24:38.490893 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:38.490803 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3ed75c45-c2ee-454f-a790-25274298b31d-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk\" (UID: \"3ed75c45-c2ee-454f-a790-25274298b31d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk" Apr 16 20:24:38.490893 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:38.490838 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3ed75c45-c2ee-454f-a790-25274298b31d-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk\" (UID: \"3ed75c45-c2ee-454f-a790-25274298b31d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk" Apr 16 20:24:38.491217 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:38.491195 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3ed75c45-c2ee-454f-a790-25274298b31d-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk\" (UID: \"3ed75c45-c2ee-454f-a790-25274298b31d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk" Apr 16 20:24:38.491270 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:38.491222 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3ed75c45-c2ee-454f-a790-25274298b31d-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk\" (UID: \"3ed75c45-c2ee-454f-a790-25274298b31d\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk" Apr 16 20:24:38.491305 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:38.491273 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3ed75c45-c2ee-454f-a790-25274298b31d-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk\" (UID: \"3ed75c45-c2ee-454f-a790-25274298b31d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk" Apr 16 20:24:38.491305 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:38.491281 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3ed75c45-c2ee-454f-a790-25274298b31d-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk\" (UID: \"3ed75c45-c2ee-454f-a790-25274298b31d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk" Apr 16 20:24:38.493176 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:38.493160 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3ed75c45-c2ee-454f-a790-25274298b31d-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk\" (UID: \"3ed75c45-c2ee-454f-a790-25274298b31d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk" Apr 16 20:24:38.498673 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:38.498648 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjdp6\" (UniqueName: \"kubernetes.io/projected/3ed75c45-c2ee-454f-a790-25274298b31d-kube-api-access-pjdp6\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk\" (UID: \"3ed75c45-c2ee-454f-a790-25274298b31d\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk" Apr 16 20:24:38.565470 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:38.565402 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk" Apr 16 20:24:38.683460 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:38.683431 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk"] Apr 16 20:24:38.686490 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:24:38.686453 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ed75c45_c2ee_454f_a790_25274298b31d.slice/crio-98db70d87cf6f51d111f0742e81d6da58a04ceb670b75ecbc610e3379f20cb80 WatchSource:0}: Error finding container 98db70d87cf6f51d111f0742e81d6da58a04ceb670b75ecbc610e3379f20cb80: Status 404 returned error can't find the container with id 98db70d87cf6f51d111f0742e81d6da58a04ceb670b75ecbc610e3379f20cb80 Apr 16 20:24:39.314158 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:39.314121 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk" event={"ID":"3ed75c45-c2ee-454f-a790-25274298b31d","Type":"ContainerStarted","Data":"c0c224b9246f85a9054c328c1381092332f169d0a2b5b153c55c5eb37935a823"} Apr 16 20:24:39.314158 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:39.314158 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk" event={"ID":"3ed75c45-c2ee-454f-a790-25274298b31d","Type":"ContainerStarted","Data":"98db70d87cf6f51d111f0742e81d6da58a04ceb670b75ecbc610e3379f20cb80"} Apr 16 20:24:40.318484 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:40.318404 2576 generic.go:358] "Generic (PLEG): 
container finished" podID="3ed75c45-c2ee-454f-a790-25274298b31d" containerID="c0c224b9246f85a9054c328c1381092332f169d0a2b5b153c55c5eb37935a823" exitCode=0 Apr 16 20:24:40.318484 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:40.318466 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk" event={"ID":"3ed75c45-c2ee-454f-a790-25274298b31d","Type":"ContainerDied","Data":"c0c224b9246f85a9054c328c1381092332f169d0a2b5b153c55c5eb37935a823"} Apr 16 20:24:41.323407 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:41.323373 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk" event={"ID":"3ed75c45-c2ee-454f-a790-25274298b31d","Type":"ContainerStarted","Data":"aa7309ae3d537fd1feb00ed22cfc2586f29a2ddde6c0535bd65a45659c884f65"} Apr 16 20:24:41.323776 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:41.323416 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk" event={"ID":"3ed75c45-c2ee-454f-a790-25274298b31d","Type":"ContainerStarted","Data":"70aaa8ec49b1da78502c107d2a77790829d8f1582a45ac69d77a8a6ffd78ce64"} Apr 16 20:24:41.323776 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:41.323590 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk" Apr 16 20:24:41.345384 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:41.345325 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk" podStartSLOduration=3.345305837 podStartE2EDuration="3.345305837s" podCreationTimestamp="2026-04-16 20:24:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-04-16 20:24:41.342575833 +0000 UTC m=+781.876931586" watchObservedRunningTime="2026-04-16 20:24:41.345305837 +0000 UTC m=+781.879661566" Apr 16 20:24:48.566298 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:48.566255 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk" Apr 16 20:24:48.566815 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:48.566402 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk" Apr 16 20:24:48.568896 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:48.568874 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk" Apr 16 20:24:49.348480 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:24:49.348453 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk" Apr 16 20:25:11.354080 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:11.354052 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk" Apr 16 20:25:22.613282 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:22.613250 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8"] Apr 16 20:25:22.616793 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:22.616772 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" Apr 16 20:25:22.619364 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:22.619337 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 16 20:25:22.619487 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:22.619427 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-55f7ae4a-epp-sa-dockercfg-rfs4f\"" Apr 16 20:25:22.628213 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:22.628191 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8"] Apr 16 20:25:22.729167 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:22.729138 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8\" (UID: \"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" Apr 16 20:25:22.729278 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:22.729177 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8\" (UID: \"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" Apr 16 20:25:22.729278 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:22.729238 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbpz5\" (UniqueName: \"kubernetes.io/projected/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2-kube-api-access-rbpz5\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8\" (UID: \"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" Apr 16 20:25:22.729347 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:22.729296 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8\" (UID: \"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" Apr 16 20:25:22.729347 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:22.729325 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8\" (UID: \"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" Apr 16 20:25:22.729347 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:22.729341 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8\" (UID: \"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" Apr 16 20:25:22.830486 ip-10-0-141-145 kubenswrapper[2576]: I0416 
20:25:22.830459 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8\" (UID: \"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" Apr 16 20:25:22.830608 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:22.830498 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8\" (UID: \"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" Apr 16 20:25:22.830608 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:22.830515 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8\" (UID: \"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" Apr 16 20:25:22.830608 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:22.830552 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8\" (UID: \"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" Apr 16 20:25:22.830608 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:22.830581 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8\" (UID: \"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" Apr 16 20:25:22.830815 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:22.830619 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rbpz5\" (UniqueName: \"kubernetes.io/projected/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2-kube-api-access-rbpz5\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8\" (UID: \"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" Apr 16 20:25:22.830915 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:22.830897 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8\" (UID: \"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" Apr 16 20:25:22.830970 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:22.830948 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8\" (UID: \"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" Apr 16 20:25:22.831036 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:22.830968 2576 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8\" (UID: \"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" Apr 16 20:25:22.831094 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:22.831076 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8\" (UID: \"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" Apr 16 20:25:22.833042 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:22.833020 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8\" (UID: \"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" Apr 16 20:25:22.837631 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:22.837610 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbpz5\" (UniqueName: \"kubernetes.io/projected/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2-kube-api-access-rbpz5\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8\" (UID: \"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" Apr 16 20:25:22.927010 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:22.926936 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" Apr 16 20:25:23.043262 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:23.043232 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8"] Apr 16 20:25:23.046443 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:25:23.046416 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod804d37ec_8773_4e3b_bfbc_6adc3e8b62d2.slice/crio-838efdf376c6848ce8b7dce9cb89c557272689152f0dd2679779a7526cb09257 WatchSource:0}: Error finding container 838efdf376c6848ce8b7dce9cb89c557272689152f0dd2679779a7526cb09257: Status 404 returned error can't find the container with id 838efdf376c6848ce8b7dce9cb89c557272689152f0dd2679779a7526cb09257 Apr 16 20:25:23.458908 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:23.458873 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" event={"ID":"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2","Type":"ContainerStarted","Data":"64ff63b09f94384aac6f9ae7481c0c7f0c460f17e6876bb63a21c35552864d05"} Apr 16 20:25:23.458908 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:23.458912 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" event={"ID":"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2","Type":"ContainerStarted","Data":"838efdf376c6848ce8b7dce9cb89c557272689152f0dd2679779a7526cb09257"} Apr 16 20:25:24.463448 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:24.463410 2576 generic.go:358] "Generic (PLEG): container finished" podID="804d37ec-8773-4e3b-bfbc-6adc3e8b62d2" containerID="64ff63b09f94384aac6f9ae7481c0c7f0c460f17e6876bb63a21c35552864d05" exitCode=0 Apr 16 20:25:24.463832 ip-10-0-141-145 kubenswrapper[2576]: I0416 
20:25:24.463470 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" event={"ID":"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2","Type":"ContainerDied","Data":"64ff63b09f94384aac6f9ae7481c0c7f0c460f17e6876bb63a21c35552864d05"} Apr 16 20:25:25.468927 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:25.468890 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" event={"ID":"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2","Type":"ContainerStarted","Data":"a3bb75ceb121be2d87dbbf5ae7b664dca93bc3482efcf692202cfde637c9da1f"} Apr 16 20:25:25.468927 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:25.468932 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" event={"ID":"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2","Type":"ContainerStarted","Data":"8b36314cb290afd5722c1fc1b6c2458205fe28f3881d6e6a8895f279a37d9bd7"} Apr 16 20:25:25.469347 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:25.469007 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" Apr 16 20:25:25.489278 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:25.489234 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" podStartSLOduration=3.489221839 podStartE2EDuration="3.489221839s" podCreationTimestamp="2026-04-16 20:25:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:25:25.487422775 +0000 UTC m=+826.021778501" watchObservedRunningTime="2026-04-16 20:25:25.489221839 +0000 UTC m=+826.023577565" Apr 16 20:25:32.773312 ip-10-0-141-145 
kubenswrapper[2576]: I0416 20:25:32.773276 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk"] Apr 16 20:25:32.773706 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:32.773592 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk" podUID="3ed75c45-c2ee-454f-a790-25274298b31d" containerName="main" containerID="cri-o://70aaa8ec49b1da78502c107d2a77790829d8f1582a45ac69d77a8a6ffd78ce64" gracePeriod=30 Apr 16 20:25:32.773706 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:32.773636 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk" podUID="3ed75c45-c2ee-454f-a790-25274298b31d" containerName="tokenizer" containerID="cri-o://aa7309ae3d537fd1feb00ed22cfc2586f29a2ddde6c0535bd65a45659c884f65" gracePeriod=30 Apr 16 20:25:32.927928 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:32.927895 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" Apr 16 20:25:32.927928 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:32.927934 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" Apr 16 20:25:32.930596 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:32.930561 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" Apr 16 20:25:33.497579 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:33.497543 2576 generic.go:358] "Generic (PLEG): container finished" podID="3ed75c45-c2ee-454f-a790-25274298b31d" 
containerID="70aaa8ec49b1da78502c107d2a77790829d8f1582a45ac69d77a8a6ffd78ce64" exitCode=0 Apr 16 20:25:33.497723 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:33.497614 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk" event={"ID":"3ed75c45-c2ee-454f-a790-25274298b31d","Type":"ContainerDied","Data":"70aaa8ec49b1da78502c107d2a77790829d8f1582a45ac69d77a8a6ffd78ce64"} Apr 16 20:25:33.499135 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:33.499108 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" Apr 16 20:25:33.915026 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:33.915007 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk" Apr 16 20:25:34.029414 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:34.029389 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3ed75c45-c2ee-454f-a790-25274298b31d-kserve-provision-location\") pod \"3ed75c45-c2ee-454f-a790-25274298b31d\" (UID: \"3ed75c45-c2ee-454f-a790-25274298b31d\") " Apr 16 20:25:34.029536 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:34.029421 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3ed75c45-c2ee-454f-a790-25274298b31d-tokenizer-cache\") pod \"3ed75c45-c2ee-454f-a790-25274298b31d\" (UID: \"3ed75c45-c2ee-454f-a790-25274298b31d\") " Apr 16 20:25:34.029536 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:34.029461 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjdp6\" (UniqueName: 
\"kubernetes.io/projected/3ed75c45-c2ee-454f-a790-25274298b31d-kube-api-access-pjdp6\") pod \"3ed75c45-c2ee-454f-a790-25274298b31d\" (UID: \"3ed75c45-c2ee-454f-a790-25274298b31d\") " Apr 16 20:25:34.029536 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:34.029482 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3ed75c45-c2ee-454f-a790-25274298b31d-tokenizer-tmp\") pod \"3ed75c45-c2ee-454f-a790-25274298b31d\" (UID: \"3ed75c45-c2ee-454f-a790-25274298b31d\") " Apr 16 20:25:34.029536 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:34.029499 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3ed75c45-c2ee-454f-a790-25274298b31d-tls-certs\") pod \"3ed75c45-c2ee-454f-a790-25274298b31d\" (UID: \"3ed75c45-c2ee-454f-a790-25274298b31d\") " Apr 16 20:25:34.029536 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:34.029519 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3ed75c45-c2ee-454f-a790-25274298b31d-tokenizer-uds\") pod \"3ed75c45-c2ee-454f-a790-25274298b31d\" (UID: \"3ed75c45-c2ee-454f-a790-25274298b31d\") " Apr 16 20:25:34.029816 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:34.029786 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ed75c45-c2ee-454f-a790-25274298b31d-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "3ed75c45-c2ee-454f-a790-25274298b31d" (UID: "3ed75c45-c2ee-454f-a790-25274298b31d"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:25:34.029879 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:34.029832 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ed75c45-c2ee-454f-a790-25274298b31d-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "3ed75c45-c2ee-454f-a790-25274298b31d" (UID: "3ed75c45-c2ee-454f-a790-25274298b31d"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:25:34.029879 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:34.029864 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ed75c45-c2ee-454f-a790-25274298b31d-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "3ed75c45-c2ee-454f-a790-25274298b31d" (UID: "3ed75c45-c2ee-454f-a790-25274298b31d"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:25:34.030160 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:34.030137 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ed75c45-c2ee-454f-a790-25274298b31d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3ed75c45-c2ee-454f-a790-25274298b31d" (UID: "3ed75c45-c2ee-454f-a790-25274298b31d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:25:34.031473 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:34.031450 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ed75c45-c2ee-454f-a790-25274298b31d-kube-api-access-pjdp6" (OuterVolumeSpecName: "kube-api-access-pjdp6") pod "3ed75c45-c2ee-454f-a790-25274298b31d" (UID: "3ed75c45-c2ee-454f-a790-25274298b31d"). InnerVolumeSpecName "kube-api-access-pjdp6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:25:34.031553 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:34.031536 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ed75c45-c2ee-454f-a790-25274298b31d-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "3ed75c45-c2ee-454f-a790-25274298b31d" (UID: "3ed75c45-c2ee-454f-a790-25274298b31d"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:25:34.130793 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:34.130772 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pjdp6\" (UniqueName: \"kubernetes.io/projected/3ed75c45-c2ee-454f-a790-25274298b31d-kube-api-access-pjdp6\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:25:34.130793 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:34.130795 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3ed75c45-c2ee-454f-a790-25274298b31d-tokenizer-tmp\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:25:34.130971 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:34.130809 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3ed75c45-c2ee-454f-a790-25274298b31d-tls-certs\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:25:34.130971 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:34.130821 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3ed75c45-c2ee-454f-a790-25274298b31d-tokenizer-uds\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:25:34.130971 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:34.130833 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/3ed75c45-c2ee-454f-a790-25274298b31d-kserve-provision-location\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:25:34.130971 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:34.130845 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3ed75c45-c2ee-454f-a790-25274298b31d-tokenizer-cache\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:25:34.502138 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:34.502105 2576 generic.go:358] "Generic (PLEG): container finished" podID="3ed75c45-c2ee-454f-a790-25274298b31d" containerID="aa7309ae3d537fd1feb00ed22cfc2586f29a2ddde6c0535bd65a45659c884f65" exitCode=0 Apr 16 20:25:34.502262 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:34.502181 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk" Apr 16 20:25:34.502262 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:34.502185 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk" event={"ID":"3ed75c45-c2ee-454f-a790-25274298b31d","Type":"ContainerDied","Data":"aa7309ae3d537fd1feb00ed22cfc2586f29a2ddde6c0535bd65a45659c884f65"} Apr 16 20:25:34.502262 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:34.502228 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk" event={"ID":"3ed75c45-c2ee-454f-a790-25274298b31d","Type":"ContainerDied","Data":"98db70d87cf6f51d111f0742e81d6da58a04ceb670b75ecbc610e3379f20cb80"} Apr 16 20:25:34.502262 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:34.502244 2576 scope.go:117] "RemoveContainer" containerID="aa7309ae3d537fd1feb00ed22cfc2586f29a2ddde6c0535bd65a45659c884f65" Apr 16 20:25:34.509641 ip-10-0-141-145 kubenswrapper[2576]: 
I0416 20:25:34.509620 2576 scope.go:117] "RemoveContainer" containerID="70aaa8ec49b1da78502c107d2a77790829d8f1582a45ac69d77a8a6ffd78ce64" Apr 16 20:25:34.516173 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:34.516156 2576 scope.go:117] "RemoveContainer" containerID="c0c224b9246f85a9054c328c1381092332f169d0a2b5b153c55c5eb37935a823" Apr 16 20:25:34.519764 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:34.519743 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk"] Apr 16 20:25:34.523716 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:34.523689 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cl22wk"] Apr 16 20:25:34.524766 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:34.524672 2576 scope.go:117] "RemoveContainer" containerID="aa7309ae3d537fd1feb00ed22cfc2586f29a2ddde6c0535bd65a45659c884f65" Apr 16 20:25:34.524950 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:25:34.524931 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa7309ae3d537fd1feb00ed22cfc2586f29a2ddde6c0535bd65a45659c884f65\": container with ID starting with aa7309ae3d537fd1feb00ed22cfc2586f29a2ddde6c0535bd65a45659c884f65 not found: ID does not exist" containerID="aa7309ae3d537fd1feb00ed22cfc2586f29a2ddde6c0535bd65a45659c884f65" Apr 16 20:25:34.525027 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:34.524958 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa7309ae3d537fd1feb00ed22cfc2586f29a2ddde6c0535bd65a45659c884f65"} err="failed to get container status \"aa7309ae3d537fd1feb00ed22cfc2586f29a2ddde6c0535bd65a45659c884f65\": rpc error: code = NotFound desc = could not find container \"aa7309ae3d537fd1feb00ed22cfc2586f29a2ddde6c0535bd65a45659c884f65\": container with ID starting with 
aa7309ae3d537fd1feb00ed22cfc2586f29a2ddde6c0535bd65a45659c884f65 not found: ID does not exist" Apr 16 20:25:34.525027 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:34.524974 2576 scope.go:117] "RemoveContainer" containerID="70aaa8ec49b1da78502c107d2a77790829d8f1582a45ac69d77a8a6ffd78ce64" Apr 16 20:25:34.525225 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:25:34.525206 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70aaa8ec49b1da78502c107d2a77790829d8f1582a45ac69d77a8a6ffd78ce64\": container with ID starting with 70aaa8ec49b1da78502c107d2a77790829d8f1582a45ac69d77a8a6ffd78ce64 not found: ID does not exist" containerID="70aaa8ec49b1da78502c107d2a77790829d8f1582a45ac69d77a8a6ffd78ce64" Apr 16 20:25:34.525265 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:34.525233 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70aaa8ec49b1da78502c107d2a77790829d8f1582a45ac69d77a8a6ffd78ce64"} err="failed to get container status \"70aaa8ec49b1da78502c107d2a77790829d8f1582a45ac69d77a8a6ffd78ce64\": rpc error: code = NotFound desc = could not find container \"70aaa8ec49b1da78502c107d2a77790829d8f1582a45ac69d77a8a6ffd78ce64\": container with ID starting with 70aaa8ec49b1da78502c107d2a77790829d8f1582a45ac69d77a8a6ffd78ce64 not found: ID does not exist" Apr 16 20:25:34.525265 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:34.525250 2576 scope.go:117] "RemoveContainer" containerID="c0c224b9246f85a9054c328c1381092332f169d0a2b5b153c55c5eb37935a823" Apr 16 20:25:34.525455 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:25:34.525441 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0c224b9246f85a9054c328c1381092332f169d0a2b5b153c55c5eb37935a823\": container with ID starting with c0c224b9246f85a9054c328c1381092332f169d0a2b5b153c55c5eb37935a823 not found: ID does not exist" 
containerID="c0c224b9246f85a9054c328c1381092332f169d0a2b5b153c55c5eb37935a823" Apr 16 20:25:34.525486 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:34.525459 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0c224b9246f85a9054c328c1381092332f169d0a2b5b153c55c5eb37935a823"} err="failed to get container status \"c0c224b9246f85a9054c328c1381092332f169d0a2b5b153c55c5eb37935a823\": rpc error: code = NotFound desc = could not find container \"c0c224b9246f85a9054c328c1381092332f169d0a2b5b153c55c5eb37935a823\": container with ID starting with c0c224b9246f85a9054c328c1381092332f169d0a2b5b153c55c5eb37935a823 not found: ID does not exist" Apr 16 20:25:36.060815 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:36.060782 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ed75c45-c2ee-454f-a790-25274298b31d" path="/var/lib/kubelet/pods/3ed75c45-c2ee-454f-a790-25274298b31d/volumes" Apr 16 20:25:54.504034 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:25:54.503999 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" Apr 16 20:27:39.399689 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:39.399653 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8"] Apr 16 20:27:39.402355 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:39.400064 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" podUID="804d37ec-8773-4e3b-bfbc-6adc3e8b62d2" containerName="main" containerID="cri-o://8b36314cb290afd5722c1fc1b6c2458205fe28f3881d6e6a8895f279a37d9bd7" gracePeriod=30 Apr 16 20:27:39.402355 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:39.400140 2576 kuberuntime_container.go:864] "Killing container with a grace 
period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" podUID="804d37ec-8773-4e3b-bfbc-6adc3e8b62d2" containerName="tokenizer" containerID="cri-o://a3bb75ceb121be2d87dbbf5ae7b664dca93bc3482efcf692202cfde637c9da1f" gracePeriod=30 Apr 16 20:27:39.878798 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:39.878764 2576 generic.go:358] "Generic (PLEG): container finished" podID="804d37ec-8773-4e3b-bfbc-6adc3e8b62d2" containerID="8b36314cb290afd5722c1fc1b6c2458205fe28f3881d6e6a8895f279a37d9bd7" exitCode=0 Apr 16 20:27:39.878964 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:39.878839 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" event={"ID":"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2","Type":"ContainerDied","Data":"8b36314cb290afd5722c1fc1b6c2458205fe28f3881d6e6a8895f279a37d9bd7"} Apr 16 20:27:40.453438 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:40.453417 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" Apr 16 20:27:40.562488 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:40.562424 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbpz5\" (UniqueName: \"kubernetes.io/projected/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2-kube-api-access-rbpz5\") pod \"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2\" (UID: \"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2\") " Apr 16 20:27:40.562488 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:40.562454 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2-tokenizer-cache\") pod \"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2\" (UID: \"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2\") " Apr 16 20:27:40.562488 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:40.562478 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2-tokenizer-uds\") pod \"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2\" (UID: \"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2\") " Apr 16 20:27:40.562666 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:40.562524 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2-kserve-provision-location\") pod \"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2\" (UID: \"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2\") " Apr 16 20:27:40.562666 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:40.562557 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2-tokenizer-tmp\") pod \"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2\" (UID: 
\"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2\") " Apr 16 20:27:40.562666 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:40.562586 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2-tls-certs\") pod \"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2\" (UID: \"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2\") " Apr 16 20:27:40.562809 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:40.562725 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "804d37ec-8773-4e3b-bfbc-6adc3e8b62d2" (UID: "804d37ec-8773-4e3b-bfbc-6adc3e8b62d2"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:27:40.562809 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:40.562772 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "804d37ec-8773-4e3b-bfbc-6adc3e8b62d2" (UID: "804d37ec-8773-4e3b-bfbc-6adc3e8b62d2"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:27:40.562919 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:40.562865 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2-tokenizer-cache\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:27:40.562919 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:40.562885 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2-tokenizer-uds\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:27:40.562919 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:40.562909 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "804d37ec-8773-4e3b-bfbc-6adc3e8b62d2" (UID: "804d37ec-8773-4e3b-bfbc-6adc3e8b62d2"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:27:40.563306 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:40.563286 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "804d37ec-8773-4e3b-bfbc-6adc3e8b62d2" (UID: "804d37ec-8773-4e3b-bfbc-6adc3e8b62d2"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:27:40.564374 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:40.564355 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2-kube-api-access-rbpz5" (OuterVolumeSpecName: "kube-api-access-rbpz5") pod "804d37ec-8773-4e3b-bfbc-6adc3e8b62d2" (UID: "804d37ec-8773-4e3b-bfbc-6adc3e8b62d2"). InnerVolumeSpecName "kube-api-access-rbpz5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:27:40.564642 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:40.564621 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "804d37ec-8773-4e3b-bfbc-6adc3e8b62d2" (UID: "804d37ec-8773-4e3b-bfbc-6adc3e8b62d2"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:27:40.663677 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:40.663654 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2-tls-certs\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:27:40.663677 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:40.663673 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rbpz5\" (UniqueName: \"kubernetes.io/projected/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2-kube-api-access-rbpz5\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:27:40.663792 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:40.663682 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2-kserve-provision-location\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:27:40.663792 ip-10-0-141-145 kubenswrapper[2576]: 
I0416 20:27:40.663692 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2-tokenizer-tmp\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:27:40.883386 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:40.883326 2576 generic.go:358] "Generic (PLEG): container finished" podID="804d37ec-8773-4e3b-bfbc-6adc3e8b62d2" containerID="a3bb75ceb121be2d87dbbf5ae7b664dca93bc3482efcf692202cfde637c9da1f" exitCode=0 Apr 16 20:27:40.883492 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:40.883407 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" Apr 16 20:27:40.883492 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:40.883411 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" event={"ID":"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2","Type":"ContainerDied","Data":"a3bb75ceb121be2d87dbbf5ae7b664dca93bc3482efcf692202cfde637c9da1f"} Apr 16 20:27:40.883492 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:40.883447 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8" event={"ID":"804d37ec-8773-4e3b-bfbc-6adc3e8b62d2","Type":"ContainerDied","Data":"838efdf376c6848ce8b7dce9cb89c557272689152f0dd2679779a7526cb09257"} Apr 16 20:27:40.883492 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:40.883462 2576 scope.go:117] "RemoveContainer" containerID="a3bb75ceb121be2d87dbbf5ae7b664dca93bc3482efcf692202cfde637c9da1f" Apr 16 20:27:40.891023 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:40.891000 2576 scope.go:117] "RemoveContainer" containerID="8b36314cb290afd5722c1fc1b6c2458205fe28f3881d6e6a8895f279a37d9bd7" Apr 16 20:27:40.897837 ip-10-0-141-145 kubenswrapper[2576]: I0416 
20:27:40.897823 2576 scope.go:117] "RemoveContainer" containerID="64ff63b09f94384aac6f9ae7481c0c7f0c460f17e6876bb63a21c35552864d05" Apr 16 20:27:40.903445 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:40.903424 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8"] Apr 16 20:27:40.904197 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:40.904176 2576 scope.go:117] "RemoveContainer" containerID="a3bb75ceb121be2d87dbbf5ae7b664dca93bc3482efcf692202cfde637c9da1f" Apr 16 20:27:40.904432 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:27:40.904409 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3bb75ceb121be2d87dbbf5ae7b664dca93bc3482efcf692202cfde637c9da1f\": container with ID starting with a3bb75ceb121be2d87dbbf5ae7b664dca93bc3482efcf692202cfde637c9da1f not found: ID does not exist" containerID="a3bb75ceb121be2d87dbbf5ae7b664dca93bc3482efcf692202cfde637c9da1f" Apr 16 20:27:40.904509 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:40.904438 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3bb75ceb121be2d87dbbf5ae7b664dca93bc3482efcf692202cfde637c9da1f"} err="failed to get container status \"a3bb75ceb121be2d87dbbf5ae7b664dca93bc3482efcf692202cfde637c9da1f\": rpc error: code = NotFound desc = could not find container \"a3bb75ceb121be2d87dbbf5ae7b664dca93bc3482efcf692202cfde637c9da1f\": container with ID starting with a3bb75ceb121be2d87dbbf5ae7b664dca93bc3482efcf692202cfde637c9da1f not found: ID does not exist" Apr 16 20:27:40.904509 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:40.904455 2576 scope.go:117] "RemoveContainer" containerID="8b36314cb290afd5722c1fc1b6c2458205fe28f3881d6e6a8895f279a37d9bd7" Apr 16 20:27:40.904699 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:27:40.904677 2576 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"8b36314cb290afd5722c1fc1b6c2458205fe28f3881d6e6a8895f279a37d9bd7\": container with ID starting with 8b36314cb290afd5722c1fc1b6c2458205fe28f3881d6e6a8895f279a37d9bd7 not found: ID does not exist" containerID="8b36314cb290afd5722c1fc1b6c2458205fe28f3881d6e6a8895f279a37d9bd7" Apr 16 20:27:40.904735 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:40.904708 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b36314cb290afd5722c1fc1b6c2458205fe28f3881d6e6a8895f279a37d9bd7"} err="failed to get container status \"8b36314cb290afd5722c1fc1b6c2458205fe28f3881d6e6a8895f279a37d9bd7\": rpc error: code = NotFound desc = could not find container \"8b36314cb290afd5722c1fc1b6c2458205fe28f3881d6e6a8895f279a37d9bd7\": container with ID starting with 8b36314cb290afd5722c1fc1b6c2458205fe28f3881d6e6a8895f279a37d9bd7 not found: ID does not exist" Apr 16 20:27:40.904735 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:40.904731 2576 scope.go:117] "RemoveContainer" containerID="64ff63b09f94384aac6f9ae7481c0c7f0c460f17e6876bb63a21c35552864d05" Apr 16 20:27:40.904965 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:27:40.904947 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64ff63b09f94384aac6f9ae7481c0c7f0c460f17e6876bb63a21c35552864d05\": container with ID starting with 64ff63b09f94384aac6f9ae7481c0c7f0c460f17e6876bb63a21c35552864d05 not found: ID does not exist" containerID="64ff63b09f94384aac6f9ae7481c0c7f0c460f17e6876bb63a21c35552864d05" Apr 16 20:27:40.905090 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:40.904970 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64ff63b09f94384aac6f9ae7481c0c7f0c460f17e6876bb63a21c35552864d05"} err="failed to get container status \"64ff63b09f94384aac6f9ae7481c0c7f0c460f17e6876bb63a21c35552864d05\": rpc 
error: code = NotFound desc = could not find container \"64ff63b09f94384aac6f9ae7481c0c7f0c460f17e6876bb63a21c35552864d05\": container with ID starting with 64ff63b09f94384aac6f9ae7481c0c7f0c460f17e6876bb63a21c35552864d05 not found: ID does not exist" Apr 16 20:27:40.907684 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:40.907665 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetrjr8"] Apr 16 20:27:41.637215 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.637178 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7"] Apr 16 20:27:41.637682 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.637628 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3ed75c45-c2ee-454f-a790-25274298b31d" containerName="storage-initializer" Apr 16 20:27:41.637682 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.637648 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed75c45-c2ee-454f-a790-25274298b31d" containerName="storage-initializer" Apr 16 20:27:41.637682 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.637665 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="804d37ec-8773-4e3b-bfbc-6adc3e8b62d2" containerName="main" Apr 16 20:27:41.637682 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.637674 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="804d37ec-8773-4e3b-bfbc-6adc3e8b62d2" containerName="main" Apr 16 20:27:41.637971 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.637686 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="804d37ec-8773-4e3b-bfbc-6adc3e8b62d2" containerName="storage-initializer" Apr 16 20:27:41.637971 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.637695 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="804d37ec-8773-4e3b-bfbc-6adc3e8b62d2" containerName="storage-initializer"
Apr 16 20:27:41.637971 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.637707 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3ed75c45-c2ee-454f-a790-25274298b31d" containerName="main"
Apr 16 20:27:41.637971 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.637715 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed75c45-c2ee-454f-a790-25274298b31d" containerName="main"
Apr 16 20:27:41.637971 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.637733 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="804d37ec-8773-4e3b-bfbc-6adc3e8b62d2" containerName="tokenizer"
Apr 16 20:27:41.637971 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.637742 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="804d37ec-8773-4e3b-bfbc-6adc3e8b62d2" containerName="tokenizer"
Apr 16 20:27:41.637971 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.637751 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3ed75c45-c2ee-454f-a790-25274298b31d" containerName="tokenizer"
Apr 16 20:27:41.637971 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.637759 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed75c45-c2ee-454f-a790-25274298b31d" containerName="tokenizer"
Apr 16 20:27:41.637971 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.637842 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3ed75c45-c2ee-454f-a790-25274298b31d" containerName="main"
Apr 16 20:27:41.637971 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.637852 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3ed75c45-c2ee-454f-a790-25274298b31d" containerName="tokenizer"
Apr 16 20:27:41.637971 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.637863 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="804d37ec-8773-4e3b-bfbc-6adc3e8b62d2" containerName="main"
Apr 16 20:27:41.637971 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.637871 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="804d37ec-8773-4e3b-bfbc-6adc3e8b62d2" containerName="tokenizer"
Apr 16 20:27:41.648422 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.648393 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7"
Apr 16 20:27:41.649848 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.649821 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7"]
Apr 16 20:27:41.650795 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.650774 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-epp-sa-dockercfg-xnrb6\""
Apr 16 20:27:41.651693 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.651669 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\""
Apr 16 20:27:41.651693 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.651684 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-s7skk\""
Apr 16 20:27:41.770246 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.770212 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e726a57e-c258-4d54-bdc1-1ccf85000b53-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7\" (UID: \"e726a57e-c258-4d54-bdc1-1ccf85000b53\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7"
Apr 16 20:27:41.770445 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.770265 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e726a57e-c258-4d54-bdc1-1ccf85000b53-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7\" (UID: \"e726a57e-c258-4d54-bdc1-1ccf85000b53\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7"
Apr 16 20:27:41.770445 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.770301 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e726a57e-c258-4d54-bdc1-1ccf85000b53-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7\" (UID: \"e726a57e-c258-4d54-bdc1-1ccf85000b53\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7"
Apr 16 20:27:41.770445 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.770325 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6nrc\" (UniqueName: \"kubernetes.io/projected/e726a57e-c258-4d54-bdc1-1ccf85000b53-kube-api-access-c6nrc\") pod \"custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7\" (UID: \"e726a57e-c258-4d54-bdc1-1ccf85000b53\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7"
Apr 16 20:27:41.770445 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.770353 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e726a57e-c258-4d54-bdc1-1ccf85000b53-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7\" (UID: \"e726a57e-c258-4d54-bdc1-1ccf85000b53\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7"
Apr 16 20:27:41.770445 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.770403 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e726a57e-c258-4d54-bdc1-1ccf85000b53-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7\" (UID: \"e726a57e-c258-4d54-bdc1-1ccf85000b53\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7"
Apr 16 20:27:41.871342 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.871306 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e726a57e-c258-4d54-bdc1-1ccf85000b53-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7\" (UID: \"e726a57e-c258-4d54-bdc1-1ccf85000b53\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7"
Apr 16 20:27:41.871539 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.871369 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e726a57e-c258-4d54-bdc1-1ccf85000b53-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7\" (UID: \"e726a57e-c258-4d54-bdc1-1ccf85000b53\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7"
Apr 16 20:27:41.871539 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.871405 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e726a57e-c258-4d54-bdc1-1ccf85000b53-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7\" (UID: \"e726a57e-c258-4d54-bdc1-1ccf85000b53\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7"
Apr 16 20:27:41.871539 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.871451 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e726a57e-c258-4d54-bdc1-1ccf85000b53-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7\" (UID: \"e726a57e-c258-4d54-bdc1-1ccf85000b53\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7"
Apr 16 20:27:41.871539 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.871477 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c6nrc\" (UniqueName: \"kubernetes.io/projected/e726a57e-c258-4d54-bdc1-1ccf85000b53-kube-api-access-c6nrc\") pod \"custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7\" (UID: \"e726a57e-c258-4d54-bdc1-1ccf85000b53\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7"
Apr 16 20:27:41.871539 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.871506 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e726a57e-c258-4d54-bdc1-1ccf85000b53-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7\" (UID: \"e726a57e-c258-4d54-bdc1-1ccf85000b53\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7"
Apr 16 20:27:41.871777 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.871754 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e726a57e-c258-4d54-bdc1-1ccf85000b53-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7\" (UID: \"e726a57e-c258-4d54-bdc1-1ccf85000b53\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7"
Apr 16 20:27:41.871827 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.871794 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e726a57e-c258-4d54-bdc1-1ccf85000b53-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7\" (UID: \"e726a57e-c258-4d54-bdc1-1ccf85000b53\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7"
Apr 16 20:27:41.871863 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.871843 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e726a57e-c258-4d54-bdc1-1ccf85000b53-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7\" (UID: \"e726a57e-c258-4d54-bdc1-1ccf85000b53\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7"
Apr 16 20:27:41.871902 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.871862 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e726a57e-c258-4d54-bdc1-1ccf85000b53-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7\" (UID: \"e726a57e-c258-4d54-bdc1-1ccf85000b53\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7"
Apr 16 20:27:41.873758 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.873740 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e726a57e-c258-4d54-bdc1-1ccf85000b53-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7\" (UID: \"e726a57e-c258-4d54-bdc1-1ccf85000b53\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7"
Apr 16 20:27:41.880715 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.880693 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6nrc\" (UniqueName: \"kubernetes.io/projected/e726a57e-c258-4d54-bdc1-1ccf85000b53-kube-api-access-c6nrc\") pod \"custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7\" (UID: \"e726a57e-c258-4d54-bdc1-1ccf85000b53\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7"
Apr 16 20:27:41.958297 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:41.958250 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7"
Apr 16 20:27:42.059602 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:42.059565 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="804d37ec-8773-4e3b-bfbc-6adc3e8b62d2" path="/var/lib/kubelet/pods/804d37ec-8773-4e3b-bfbc-6adc3e8b62d2/volumes"
Apr 16 20:27:42.084122 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:42.084100 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7"]
Apr 16 20:27:42.086344 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:27:42.086316 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode726a57e_c258_4d54_bdc1_1ccf85000b53.slice/crio-8be714fcd541c92e2e4adf885d3e0b8eedeb9ffd1950a8f7eee9f941be218697 WatchSource:0}: Error finding container 8be714fcd541c92e2e4adf885d3e0b8eedeb9ffd1950a8f7eee9f941be218697: Status 404 returned error can't find the container with id 8be714fcd541c92e2e4adf885d3e0b8eedeb9ffd1950a8f7eee9f941be218697
Apr 16 20:27:42.892443 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:42.892408 2576 generic.go:358] "Generic (PLEG): container finished" podID="e726a57e-c258-4d54-bdc1-1ccf85000b53" containerID="bc394c8be12b794e291987c9f2543dbacc4bb9b2ad8f5bf47cde9e1d86780cfb" exitCode=0
Apr 16 20:27:42.892759 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:42.892495 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7" event={"ID":"e726a57e-c258-4d54-bdc1-1ccf85000b53","Type":"ContainerDied","Data":"bc394c8be12b794e291987c9f2543dbacc4bb9b2ad8f5bf47cde9e1d86780cfb"}
Apr 16 20:27:42.892759 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:42.892529 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7" event={"ID":"e726a57e-c258-4d54-bdc1-1ccf85000b53","Type":"ContainerStarted","Data":"8be714fcd541c92e2e4adf885d3e0b8eedeb9ffd1950a8f7eee9f941be218697"}
Apr 16 20:27:43.897843 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:43.897810 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7" event={"ID":"e726a57e-c258-4d54-bdc1-1ccf85000b53","Type":"ContainerStarted","Data":"fd57a27899778ec8c002aaa70e9309f7317188c36b9bf86da1b9c1042926db1a"}
Apr 16 20:27:43.897843 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:43.897842 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7" event={"ID":"e726a57e-c258-4d54-bdc1-1ccf85000b53","Type":"ContainerStarted","Data":"924e5ab1ca19802a2e0444d28868e3dfe0eacbc9077297871be954b5b8f55fa7"}
Apr 16 20:27:43.898346 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:43.897916 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7"
Apr 16 20:27:43.920881 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:43.920833 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7" podStartSLOduration=2.920817966 podStartE2EDuration="2.920817966s" podCreationTimestamp="2026-04-16 20:27:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:27:43.917969868 +0000 UTC m=+964.452325594" watchObservedRunningTime="2026-04-16 20:27:43.920817966 +0000 UTC m=+964.455173693"
Apr 16 20:27:51.959099 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:51.958972 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7"
Apr 16 20:27:51.959099 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:51.959056 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7"
Apr 16 20:27:51.961712 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:51.961689 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7"
Apr 16 20:27:52.927151 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:27:52.927119 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7"
Apr 16 20:28:13.935750 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:28:13.935719 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7"
Apr 16 20:29:27.403454 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:27.403366 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7"]
Apr 16 20:29:27.403957 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:27.403687 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7" podUID="e726a57e-c258-4d54-bdc1-1ccf85000b53" containerName="main" containerID="cri-o://924e5ab1ca19802a2e0444d28868e3dfe0eacbc9077297871be954b5b8f55fa7" gracePeriod=30
Apr 16 20:29:27.403957 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:27.403757 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7" podUID="e726a57e-c258-4d54-bdc1-1ccf85000b53" containerName="tokenizer" containerID="cri-o://fd57a27899778ec8c002aaa70e9309f7317188c36b9bf86da1b9c1042926db1a" gracePeriod=30
Apr 16 20:29:28.230905 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:28.230873 2576 generic.go:358] "Generic (PLEG): container finished" podID="e726a57e-c258-4d54-bdc1-1ccf85000b53" containerID="924e5ab1ca19802a2e0444d28868e3dfe0eacbc9077297871be954b5b8f55fa7" exitCode=0
Apr 16 20:29:28.231105 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:28.230947 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7" event={"ID":"e726a57e-c258-4d54-bdc1-1ccf85000b53","Type":"ContainerDied","Data":"924e5ab1ca19802a2e0444d28868e3dfe0eacbc9077297871be954b5b8f55fa7"}
Apr 16 20:29:28.545143 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:28.545123 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7"
Apr 16 20:29:28.672901 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:28.672875 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e726a57e-c258-4d54-bdc1-1ccf85000b53-tokenizer-cache\") pod \"e726a57e-c258-4d54-bdc1-1ccf85000b53\" (UID: \"e726a57e-c258-4d54-bdc1-1ccf85000b53\") "
Apr 16 20:29:28.673086 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:28.672918 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e726a57e-c258-4d54-bdc1-1ccf85000b53-tokenizer-uds\") pod \"e726a57e-c258-4d54-bdc1-1ccf85000b53\" (UID: \"e726a57e-c258-4d54-bdc1-1ccf85000b53\") "
Apr 16 20:29:28.673086 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:28.673024 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e726a57e-c258-4d54-bdc1-1ccf85000b53-kserve-provision-location\") pod \"e726a57e-c258-4d54-bdc1-1ccf85000b53\" (UID: \"e726a57e-c258-4d54-bdc1-1ccf85000b53\") "
Apr 16 20:29:28.673086 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:28.673057 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e726a57e-c258-4d54-bdc1-1ccf85000b53-tls-certs\") pod \"e726a57e-c258-4d54-bdc1-1ccf85000b53\" (UID: \"e726a57e-c258-4d54-bdc1-1ccf85000b53\") "
Apr 16 20:29:28.673086 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:28.673080 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6nrc\" (UniqueName: \"kubernetes.io/projected/e726a57e-c258-4d54-bdc1-1ccf85000b53-kube-api-access-c6nrc\") pod \"e726a57e-c258-4d54-bdc1-1ccf85000b53\" (UID: \"e726a57e-c258-4d54-bdc1-1ccf85000b53\") "
Apr 16 20:29:28.673291 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:28.673109 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e726a57e-c258-4d54-bdc1-1ccf85000b53-tokenizer-tmp\") pod \"e726a57e-c258-4d54-bdc1-1ccf85000b53\" (UID: \"e726a57e-c258-4d54-bdc1-1ccf85000b53\") "
Apr 16 20:29:28.673291 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:28.673201 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e726a57e-c258-4d54-bdc1-1ccf85000b53-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "e726a57e-c258-4d54-bdc1-1ccf85000b53" (UID: "e726a57e-c258-4d54-bdc1-1ccf85000b53"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:29:28.673291 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:28.673216 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e726a57e-c258-4d54-bdc1-1ccf85000b53-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "e726a57e-c258-4d54-bdc1-1ccf85000b53" (UID: "e726a57e-c258-4d54-bdc1-1ccf85000b53"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:29:28.673421 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:28.673353 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e726a57e-c258-4d54-bdc1-1ccf85000b53-tokenizer-cache\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\""
Apr 16 20:29:28.673421 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:28.673373 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e726a57e-c258-4d54-bdc1-1ccf85000b53-tokenizer-uds\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\""
Apr 16 20:29:28.673488 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:28.673441 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e726a57e-c258-4d54-bdc1-1ccf85000b53-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "e726a57e-c258-4d54-bdc1-1ccf85000b53" (UID: "e726a57e-c258-4d54-bdc1-1ccf85000b53"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:29:28.673736 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:28.673717 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e726a57e-c258-4d54-bdc1-1ccf85000b53-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e726a57e-c258-4d54-bdc1-1ccf85000b53" (UID: "e726a57e-c258-4d54-bdc1-1ccf85000b53"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:29:28.675063 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:28.675042 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e726a57e-c258-4d54-bdc1-1ccf85000b53-kube-api-access-c6nrc" (OuterVolumeSpecName: "kube-api-access-c6nrc") pod "e726a57e-c258-4d54-bdc1-1ccf85000b53" (UID: "e726a57e-c258-4d54-bdc1-1ccf85000b53"). InnerVolumeSpecName "kube-api-access-c6nrc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:29:28.675128 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:28.675082 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e726a57e-c258-4d54-bdc1-1ccf85000b53-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e726a57e-c258-4d54-bdc1-1ccf85000b53" (UID: "e726a57e-c258-4d54-bdc1-1ccf85000b53"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:29:28.774710 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:28.774679 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e726a57e-c258-4d54-bdc1-1ccf85000b53-kserve-provision-location\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\""
Apr 16 20:29:28.774710 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:28.774706 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e726a57e-c258-4d54-bdc1-1ccf85000b53-tls-certs\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\""
Apr 16 20:29:28.774870 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:28.774719 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c6nrc\" (UniqueName: \"kubernetes.io/projected/e726a57e-c258-4d54-bdc1-1ccf85000b53-kube-api-access-c6nrc\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\""
Apr 16 20:29:28.774870 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:28.774732 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e726a57e-c258-4d54-bdc1-1ccf85000b53-tokenizer-tmp\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\""
Apr 16 20:29:29.235197 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:29.235164 2576 generic.go:358] "Generic (PLEG): container finished" podID="e726a57e-c258-4d54-bdc1-1ccf85000b53" containerID="fd57a27899778ec8c002aaa70e9309f7317188c36b9bf86da1b9c1042926db1a" exitCode=0
Apr 16 20:29:29.235389 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:29.235240 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7"
Apr 16 20:29:29.235389 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:29.235239 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7" event={"ID":"e726a57e-c258-4d54-bdc1-1ccf85000b53","Type":"ContainerDied","Data":"fd57a27899778ec8c002aaa70e9309f7317188c36b9bf86da1b9c1042926db1a"}
Apr 16 20:29:29.235389 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:29.235342 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7" event={"ID":"e726a57e-c258-4d54-bdc1-1ccf85000b53","Type":"ContainerDied","Data":"8be714fcd541c92e2e4adf885d3e0b8eedeb9ffd1950a8f7eee9f941be218697"}
Apr 16 20:29:29.235389 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:29.235357 2576 scope.go:117] "RemoveContainer" containerID="fd57a27899778ec8c002aaa70e9309f7317188c36b9bf86da1b9c1042926db1a"
Apr 16 20:29:29.247372 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:29.247347 2576 scope.go:117] "RemoveContainer" containerID="924e5ab1ca19802a2e0444d28868e3dfe0eacbc9077297871be954b5b8f55fa7"
Apr 16 20:29:29.256224 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:29.256206 2576 scope.go:117] "RemoveContainer" containerID="bc394c8be12b794e291987c9f2543dbacc4bb9b2ad8f5bf47cde9e1d86780cfb"
Apr 16 20:29:29.263062 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:29.263037 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7"]
Apr 16 20:29:29.266097 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:29.266054 2576 scope.go:117] "RemoveContainer" containerID="fd57a27899778ec8c002aaa70e9309f7317188c36b9bf86da1b9c1042926db1a"
Apr 16 20:29:29.266438 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:29:29.266408 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd57a27899778ec8c002aaa70e9309f7317188c36b9bf86da1b9c1042926db1a\": container with ID starting with fd57a27899778ec8c002aaa70e9309f7317188c36b9bf86da1b9c1042926db1a not found: ID does not exist" containerID="fd57a27899778ec8c002aaa70e9309f7317188c36b9bf86da1b9c1042926db1a"
Apr 16 20:29:29.266529 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:29.266450 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd57a27899778ec8c002aaa70e9309f7317188c36b9bf86da1b9c1042926db1a"} err="failed to get container status \"fd57a27899778ec8c002aaa70e9309f7317188c36b9bf86da1b9c1042926db1a\": rpc error: code = NotFound desc = could not find container \"fd57a27899778ec8c002aaa70e9309f7317188c36b9bf86da1b9c1042926db1a\": container with ID starting with fd57a27899778ec8c002aaa70e9309f7317188c36b9bf86da1b9c1042926db1a not found: ID does not exist"
Apr 16 20:29:29.266529 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:29.266479 2576 scope.go:117] "RemoveContainer" containerID="924e5ab1ca19802a2e0444d28868e3dfe0eacbc9077297871be954b5b8f55fa7"
Apr 16 20:29:29.266770 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:29:29.266750 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"924e5ab1ca19802a2e0444d28868e3dfe0eacbc9077297871be954b5b8f55fa7\": container with ID starting with 924e5ab1ca19802a2e0444d28868e3dfe0eacbc9077297871be954b5b8f55fa7 not found: ID does not exist" containerID="924e5ab1ca19802a2e0444d28868e3dfe0eacbc9077297871be954b5b8f55fa7"
Apr 16 20:29:29.266848 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:29.266778 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"924e5ab1ca19802a2e0444d28868e3dfe0eacbc9077297871be954b5b8f55fa7"} err="failed to get container status \"924e5ab1ca19802a2e0444d28868e3dfe0eacbc9077297871be954b5b8f55fa7\": rpc error: code = NotFound desc = could not find container \"924e5ab1ca19802a2e0444d28868e3dfe0eacbc9077297871be954b5b8f55fa7\": container with ID starting with 924e5ab1ca19802a2e0444d28868e3dfe0eacbc9077297871be954b5b8f55fa7 not found: ID does not exist"
Apr 16 20:29:29.266848 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:29.266800 2576 scope.go:117] "RemoveContainer" containerID="bc394c8be12b794e291987c9f2543dbacc4bb9b2ad8f5bf47cde9e1d86780cfb"
Apr 16 20:29:29.266945 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:29.266859 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7499df654tfh7"]
Apr 16 20:29:29.267166 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:29:29.267132 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc394c8be12b794e291987c9f2543dbacc4bb9b2ad8f5bf47cde9e1d86780cfb\": container with ID starting with bc394c8be12b794e291987c9f2543dbacc4bb9b2ad8f5bf47cde9e1d86780cfb not found: ID does not exist" containerID="bc394c8be12b794e291987c9f2543dbacc4bb9b2ad8f5bf47cde9e1d86780cfb"
Apr 16 20:29:29.267235 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:29.267168 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc394c8be12b794e291987c9f2543dbacc4bb9b2ad8f5bf47cde9e1d86780cfb"} err="failed to get container status \"bc394c8be12b794e291987c9f2543dbacc4bb9b2ad8f5bf47cde9e1d86780cfb\": rpc error: code = NotFound desc = could not find container \"bc394c8be12b794e291987c9f2543dbacc4bb9b2ad8f5bf47cde9e1d86780cfb\": container with ID starting with bc394c8be12b794e291987c9f2543dbacc4bb9b2ad8f5bf47cde9e1d86780cfb not found: ID does not exist"
Apr 16 20:29:30.058569 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:30.058530 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e726a57e-c258-4d54-bdc1-1ccf85000b53" path="/var/lib/kubelet/pods/e726a57e-c258-4d54-bdc1-1ccf85000b53/volumes"
Apr 16 20:29:39.091228 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:39.091195 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm"]
Apr 16 20:29:39.091653 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:39.091604 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e726a57e-c258-4d54-bdc1-1ccf85000b53" containerName="storage-initializer"
Apr 16 20:29:39.091653 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:39.091621 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e726a57e-c258-4d54-bdc1-1ccf85000b53" containerName="storage-initializer"
Apr 16 20:29:39.091653 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:39.091653 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e726a57e-c258-4d54-bdc1-1ccf85000b53" containerName="main"
Apr 16 20:29:39.091808 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:39.091662 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e726a57e-c258-4d54-bdc1-1ccf85000b53" containerName="main"
Apr 16 20:29:39.091808 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:39.091674 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e726a57e-c258-4d54-bdc1-1ccf85000b53" containerName="tokenizer"
Apr 16 20:29:39.091808 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:39.091683 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e726a57e-c258-4d54-bdc1-1ccf85000b53" containerName="tokenizer"
Apr 16 20:29:39.091808 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:39.091751 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e726a57e-c258-4d54-bdc1-1ccf85000b53" containerName="tokenizer"
Apr 16 20:29:39.091808 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:39.091766 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e726a57e-c258-4d54-bdc1-1ccf85000b53" containerName="main"
Apr 16 20:29:39.096852 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:39.096830 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm"
Apr 16 20:29:39.100032 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:39.100004 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\""
Apr 16 20:29:39.100032 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:39.100022 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-s7skk\""
Apr 16 20:29:39.100186 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:39.100010 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-epp-sa-dockercfg-25dnc\""
Apr 16 20:29:39.106137 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:39.106117 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm"]
Apr 16 20:29:39.146127 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:39.146105 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm\" (UID: \"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm"
Apr 16 20:29:39.146236 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:39.146138 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm\" (UID: \"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm"
Apr 16 20:29:39.146236 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:39.146171 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm\" (UID: \"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm"
Apr 16 20:29:39.146236 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:39.146212 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm\" (UID: \"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm"
Apr 16 20:29:39.146236 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:39.146227 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm\" (UID: \"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm"
Apr 16 20:29:39.146365 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:39.146302 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6qgw\" (UniqueName: \"kubernetes.io/projected/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5-kube-api-access-z6qgw\") pod \"router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm\" (UID: \"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm"
Apr 16 20:29:39.247583 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:39.247551 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm\" (UID: \"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm"
Apr 16 20:29:39.247583 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:39.247583 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm\" (UID: \"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm"
Apr 16 20:29:39.247773 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:39.247612 2576
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z6qgw\" (UniqueName: \"kubernetes.io/projected/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5-kube-api-access-z6qgw\") pod \"router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm\" (UID: \"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm" Apr 16 20:29:39.247773 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:39.247653 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm\" (UID: \"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm" Apr 16 20:29:39.247773 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:39.247677 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm\" (UID: \"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm" Apr 16 20:29:39.247773 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:39.247723 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm\" (UID: \"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm" Apr 16 20:29:39.248026 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:39.248003 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm\" (UID: \"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm" Apr 16 20:29:39.248077 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:39.248025 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm\" (UID: \"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm" Apr 16 20:29:39.248077 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:39.248064 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm\" (UID: \"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm" Apr 16 20:29:39.248174 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:39.248158 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm\" (UID: \"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm" Apr 16 20:29:39.250039 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:39.250023 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm\" (UID: \"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm" Apr 16 20:29:39.257072 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:39.257049 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6qgw\" (UniqueName: \"kubernetes.io/projected/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5-kube-api-access-z6qgw\") pod \"router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm\" (UID: \"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm" Apr 16 20:29:39.405497 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:39.405437 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm" Apr 16 20:29:39.528538 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:39.528515 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm"] Apr 16 20:29:39.530632 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:29:39.530605 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode358f5f8_1c4d_4ef9_bae8_2cbf004ffae5.slice/crio-349cefe33675a9119f085ce97ec51c4ffbde385b05ab063f9ad6be207d3922e1 WatchSource:0}: Error finding container 349cefe33675a9119f085ce97ec51c4ffbde385b05ab063f9ad6be207d3922e1: Status 404 returned error can't find the container with id 349cefe33675a9119f085ce97ec51c4ffbde385b05ab063f9ad6be207d3922e1 Apr 16 20:29:39.532425 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:39.532411 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 
20:29:40.274450 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:40.274418 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm" event={"ID":"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5","Type":"ContainerStarted","Data":"d90f201a83525ded0eb0181ebfd12f767a09e21fe0c0078291ccb6ba23164438"} Apr 16 20:29:40.274716 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:40.274462 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm" event={"ID":"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5","Type":"ContainerStarted","Data":"349cefe33675a9119f085ce97ec51c4ffbde385b05ab063f9ad6be207d3922e1"} Apr 16 20:29:41.280667 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:41.278851 2576 generic.go:358] "Generic (PLEG): container finished" podID="e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5" containerID="d90f201a83525ded0eb0181ebfd12f767a09e21fe0c0078291ccb6ba23164438" exitCode=0 Apr 16 20:29:41.280667 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:41.278893 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm" event={"ID":"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5","Type":"ContainerDied","Data":"d90f201a83525ded0eb0181ebfd12f767a09e21fe0c0078291ccb6ba23164438"} Apr 16 20:29:42.283455 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:42.283420 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm" event={"ID":"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5","Type":"ContainerStarted","Data":"6e4c05e749dfb605d8e5c80bdf453ec020efcb7106baab3e87d197a37dd4b32d"} Apr 16 20:29:42.283455 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:42.283455 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm" 
event={"ID":"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5","Type":"ContainerStarted","Data":"6db4029be147a462cc1c14504405f48008378e905a994bdb099d569af65cdb90"} Apr 16 20:29:42.283937 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:42.283541 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm" Apr 16 20:29:42.303707 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:42.303657 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm" podStartSLOduration=3.303644165 podStartE2EDuration="3.303644165s" podCreationTimestamp="2026-04-16 20:29:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:29:42.301896806 +0000 UTC m=+1082.836252560" watchObservedRunningTime="2026-04-16 20:29:42.303644165 +0000 UTC m=+1082.837999890" Apr 16 20:29:49.406144 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:49.406113 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm" Apr 16 20:29:49.406144 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:49.406151 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm" Apr 16 20:29:49.408608 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:49.408585 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm" Apr 16 20:29:50.308885 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:29:50.308854 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm" Apr 16 20:30:11.312619 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:30:11.312591 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm" Apr 16 20:31:29.968591 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:29.968554 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm"] Apr 16 20:31:29.968995 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:29.968942 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm" podUID="e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5" containerName="main" containerID="cri-o://6db4029be147a462cc1c14504405f48008378e905a994bdb099d569af65cdb90" gracePeriod=30 Apr 16 20:31:29.969122 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:29.969047 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm" podUID="e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5" containerName="tokenizer" containerID="cri-o://6e4c05e749dfb605d8e5c80bdf453ec020efcb7106baab3e87d197a37dd4b32d" gracePeriod=30 Apr 16 20:31:30.308354 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:30.308265 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm" podUID="e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5" containerName="tokenizer" probeResult="failure" output="Get \"http://10.132.0.30:8082/healthz\": dial tcp 10.132.0.30:8082: connect: connection refused" Apr 16 20:31:30.601431 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:30.601340 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5" containerID="6db4029be147a462cc1c14504405f48008378e905a994bdb099d569af65cdb90" exitCode=0 Apr 16 20:31:30.601574 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:30.601422 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm" event={"ID":"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5","Type":"ContainerDied","Data":"6db4029be147a462cc1c14504405f48008378e905a994bdb099d569af65cdb90"} Apr 16 20:31:31.011633 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:31.011610 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm" Apr 16 20:31:31.163166 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:31.163115 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5-kserve-provision-location\") pod \"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5\" (UID: \"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5\") " Apr 16 20:31:31.163166 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:31.163144 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5-tokenizer-cache\") pod \"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5\" (UID: \"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5\") " Apr 16 20:31:31.163330 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:31.163173 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6qgw\" (UniqueName: \"kubernetes.io/projected/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5-kube-api-access-z6qgw\") pod \"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5\" (UID: \"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5\") " Apr 16 20:31:31.163330 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:31.163193 
2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5-tokenizer-uds\") pod \"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5\" (UID: \"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5\") " Apr 16 20:31:31.163330 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:31.163217 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5-tokenizer-tmp\") pod \"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5\" (UID: \"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5\") " Apr 16 20:31:31.163330 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:31.163248 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5-tls-certs\") pod \"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5\" (UID: \"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5\") " Apr 16 20:31:31.163516 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:31.163468 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5" (UID: "e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:31:31.163516 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:31.163479 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5" (UID: "e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:31:31.163621 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:31.163601 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5" (UID: "e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:31:31.163909 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:31.163888 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5" (UID: "e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:31:31.165156 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:31.165138 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5-kube-api-access-z6qgw" (OuterVolumeSpecName: "kube-api-access-z6qgw") pod "e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5" (UID: "e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5"). InnerVolumeSpecName "kube-api-access-z6qgw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:31:31.165208 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:31.165190 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5" (UID: "e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:31:31.264365 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:31.264342 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z6qgw\" (UniqueName: \"kubernetes.io/projected/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5-kube-api-access-z6qgw\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:31:31.264365 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:31.264362 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5-tokenizer-uds\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:31:31.264486 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:31.264372 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5-tokenizer-tmp\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:31:31.264486 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:31.264381 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5-tls-certs\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:31:31.264486 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:31.264389 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5-kserve-provision-location\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:31:31.264486 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:31.264397 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5-tokenizer-cache\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:31:31.606018 ip-10-0-141-145 
kubenswrapper[2576]: I0416 20:31:31.605992 2576 generic.go:358] "Generic (PLEG): container finished" podID="e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5" containerID="6e4c05e749dfb605d8e5c80bdf453ec020efcb7106baab3e87d197a37dd4b32d" exitCode=0 Apr 16 20:31:31.606119 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:31.606075 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm" Apr 16 20:31:31.606119 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:31.606076 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm" event={"ID":"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5","Type":"ContainerDied","Data":"6e4c05e749dfb605d8e5c80bdf453ec020efcb7106baab3e87d197a37dd4b32d"} Apr 16 20:31:31.606119 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:31.606112 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm" event={"ID":"e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5","Type":"ContainerDied","Data":"349cefe33675a9119f085ce97ec51c4ffbde385b05ab063f9ad6be207d3922e1"} Apr 16 20:31:31.606221 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:31.606128 2576 scope.go:117] "RemoveContainer" containerID="6e4c05e749dfb605d8e5c80bdf453ec020efcb7106baab3e87d197a37dd4b32d" Apr 16 20:31:31.613510 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:31.613390 2576 scope.go:117] "RemoveContainer" containerID="6db4029be147a462cc1c14504405f48008378e905a994bdb099d569af65cdb90" Apr 16 20:31:31.619638 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:31.619620 2576 scope.go:117] "RemoveContainer" containerID="d90f201a83525ded0eb0181ebfd12f767a09e21fe0c0078291ccb6ba23164438" Apr 16 20:31:31.626287 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:31.626268 2576 scope.go:117] "RemoveContainer" 
containerID="6e4c05e749dfb605d8e5c80bdf453ec020efcb7106baab3e87d197a37dd4b32d" Apr 16 20:31:31.626566 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:31:31.626540 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e4c05e749dfb605d8e5c80bdf453ec020efcb7106baab3e87d197a37dd4b32d\": container with ID starting with 6e4c05e749dfb605d8e5c80bdf453ec020efcb7106baab3e87d197a37dd4b32d not found: ID does not exist" containerID="6e4c05e749dfb605d8e5c80bdf453ec020efcb7106baab3e87d197a37dd4b32d" Apr 16 20:31:31.626673 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:31.626571 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e4c05e749dfb605d8e5c80bdf453ec020efcb7106baab3e87d197a37dd4b32d"} err="failed to get container status \"6e4c05e749dfb605d8e5c80bdf453ec020efcb7106baab3e87d197a37dd4b32d\": rpc error: code = NotFound desc = could not find container \"6e4c05e749dfb605d8e5c80bdf453ec020efcb7106baab3e87d197a37dd4b32d\": container with ID starting with 6e4c05e749dfb605d8e5c80bdf453ec020efcb7106baab3e87d197a37dd4b32d not found: ID does not exist" Apr 16 20:31:31.626673 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:31.626588 2576 scope.go:117] "RemoveContainer" containerID="6db4029be147a462cc1c14504405f48008378e905a994bdb099d569af65cdb90" Apr 16 20:31:31.626862 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:31:31.626845 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6db4029be147a462cc1c14504405f48008378e905a994bdb099d569af65cdb90\": container with ID starting with 6db4029be147a462cc1c14504405f48008378e905a994bdb099d569af65cdb90 not found: ID does not exist" containerID="6db4029be147a462cc1c14504405f48008378e905a994bdb099d569af65cdb90" Apr 16 20:31:31.626923 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:31.626871 2576 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"6db4029be147a462cc1c14504405f48008378e905a994bdb099d569af65cdb90"} err="failed to get container status \"6db4029be147a462cc1c14504405f48008378e905a994bdb099d569af65cdb90\": rpc error: code = NotFound desc = could not find container \"6db4029be147a462cc1c14504405f48008378e905a994bdb099d569af65cdb90\": container with ID starting with 6db4029be147a462cc1c14504405f48008378e905a994bdb099d569af65cdb90 not found: ID does not exist" Apr 16 20:31:31.626923 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:31.626893 2576 scope.go:117] "RemoveContainer" containerID="d90f201a83525ded0eb0181ebfd12f767a09e21fe0c0078291ccb6ba23164438" Apr 16 20:31:31.627186 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:31:31.627167 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d90f201a83525ded0eb0181ebfd12f767a09e21fe0c0078291ccb6ba23164438\": container with ID starting with d90f201a83525ded0eb0181ebfd12f767a09e21fe0c0078291ccb6ba23164438 not found: ID does not exist" containerID="d90f201a83525ded0eb0181ebfd12f767a09e21fe0c0078291ccb6ba23164438" Apr 16 20:31:31.627251 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:31.627190 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d90f201a83525ded0eb0181ebfd12f767a09e21fe0c0078291ccb6ba23164438"} err="failed to get container status \"d90f201a83525ded0eb0181ebfd12f767a09e21fe0c0078291ccb6ba23164438\": rpc error: code = NotFound desc = could not find container \"d90f201a83525ded0eb0181ebfd12f767a09e21fe0c0078291ccb6ba23164438\": container with ID starting with d90f201a83525ded0eb0181ebfd12f767a09e21fe0c0078291ccb6ba23164438 not found: ID does not exist" Apr 16 20:31:31.627974 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:31.627957 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm"] Apr 16 
20:31:31.631486 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:31.631463 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-65dccbbb8f-mf6rm"] Apr 16 20:31:32.058310 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:32.058283 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5" path="/var/lib/kubelet/pods/e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5/volumes" Apr 16 20:31:39.588341 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:39.588311 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n"] Apr 16 20:31:39.588791 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:39.588711 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5" containerName="storage-initializer" Apr 16 20:31:39.588791 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:39.588731 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5" containerName="storage-initializer" Apr 16 20:31:39.588791 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:39.588748 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5" containerName="main" Apr 16 20:31:39.588791 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:39.588757 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5" containerName="main" Apr 16 20:31:39.588791 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:39.588786 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5" containerName="tokenizer" Apr 16 20:31:39.588791 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:39.588795 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5" containerName="tokenizer" Apr 16 20:31:39.589117 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:39.588883 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5" containerName="tokenizer" Apr 16 20:31:39.589117 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:39.588895 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e358f5f8-1c4d-4ef9-bae8-2cbf004ffae5" containerName="main" Apr 16 20:31:39.593739 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:39.593717 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" Apr 16 20:31:39.596214 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:39.596193 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-s7skk\"" Apr 16 20:31:39.597049 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:39.597023 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-epp-sa-dockercfg-rcgrr\"" Apr 16 20:31:39.597156 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:39.597028 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 16 20:31:39.603031 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:39.603011 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n"] Apr 16 20:31:39.721052 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:39.721030 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/86755dcb-786d-4bf0-b34d-f306fe3dd4e4-tokenizer-cache\") pod 
\"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n\" (UID: \"86755dcb-786d-4bf0-b34d-f306fe3dd4e4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" Apr 16 20:31:39.721190 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:39.721078 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/86755dcb-786d-4bf0-b34d-f306fe3dd4e4-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n\" (UID: \"86755dcb-786d-4bf0-b34d-f306fe3dd4e4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" Apr 16 20:31:39.721190 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:39.721130 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/86755dcb-786d-4bf0-b34d-f306fe3dd4e4-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n\" (UID: \"86755dcb-786d-4bf0-b34d-f306fe3dd4e4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" Apr 16 20:31:39.721190 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:39.721154 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7gc2\" (UniqueName: \"kubernetes.io/projected/86755dcb-786d-4bf0-b34d-f306fe3dd4e4-kube-api-access-w7gc2\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n\" (UID: \"86755dcb-786d-4bf0-b34d-f306fe3dd4e4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" Apr 16 20:31:39.721300 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:39.721229 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/86755dcb-786d-4bf0-b34d-f306fe3dd4e4-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n\" (UID: \"86755dcb-786d-4bf0-b34d-f306fe3dd4e4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" Apr 16 20:31:39.721300 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:39.721273 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86755dcb-786d-4bf0-b34d-f306fe3dd4e4-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n\" (UID: \"86755dcb-786d-4bf0-b34d-f306fe3dd4e4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" Apr 16 20:31:39.822142 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:39.822110 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/86755dcb-786d-4bf0-b34d-f306fe3dd4e4-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n\" (UID: \"86755dcb-786d-4bf0-b34d-f306fe3dd4e4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" Apr 16 20:31:39.822291 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:39.822150 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/86755dcb-786d-4bf0-b34d-f306fe3dd4e4-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n\" (UID: \"86755dcb-786d-4bf0-b34d-f306fe3dd4e4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" Apr 16 20:31:39.822291 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:39.822167 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/86755dcb-786d-4bf0-b34d-f306fe3dd4e4-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n\" (UID: \"86755dcb-786d-4bf0-b34d-f306fe3dd4e4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" Apr 16 20:31:39.822291 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:39.822184 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w7gc2\" (UniqueName: \"kubernetes.io/projected/86755dcb-786d-4bf0-b34d-f306fe3dd4e4-kube-api-access-w7gc2\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n\" (UID: \"86755dcb-786d-4bf0-b34d-f306fe3dd4e4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" Apr 16 20:31:39.822291 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:39.822216 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/86755dcb-786d-4bf0-b34d-f306fe3dd4e4-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n\" (UID: \"86755dcb-786d-4bf0-b34d-f306fe3dd4e4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" Apr 16 20:31:39.822291 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:39.822247 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86755dcb-786d-4bf0-b34d-f306fe3dd4e4-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n\" (UID: \"86755dcb-786d-4bf0-b34d-f306fe3dd4e4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" Apr 16 20:31:39.822532 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:39.822498 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/86755dcb-786d-4bf0-b34d-f306fe3dd4e4-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n\" (UID: \"86755dcb-786d-4bf0-b34d-f306fe3dd4e4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" Apr 16 20:31:39.822622 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:39.822601 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/86755dcb-786d-4bf0-b34d-f306fe3dd4e4-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n\" (UID: \"86755dcb-786d-4bf0-b34d-f306fe3dd4e4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" Apr 16 20:31:39.822668 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:39.822604 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/86755dcb-786d-4bf0-b34d-f306fe3dd4e4-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n\" (UID: \"86755dcb-786d-4bf0-b34d-f306fe3dd4e4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" Apr 16 20:31:39.822668 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:39.822655 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86755dcb-786d-4bf0-b34d-f306fe3dd4e4-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n\" (UID: \"86755dcb-786d-4bf0-b34d-f306fe3dd4e4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" Apr 16 20:31:39.824698 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:39.824675 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/86755dcb-786d-4bf0-b34d-f306fe3dd4e4-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n\" (UID: \"86755dcb-786d-4bf0-b34d-f306fe3dd4e4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" Apr 16 20:31:39.832964 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:39.832942 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7gc2\" (UniqueName: \"kubernetes.io/projected/86755dcb-786d-4bf0-b34d-f306fe3dd4e4-kube-api-access-w7gc2\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n\" (UID: \"86755dcb-786d-4bf0-b34d-f306fe3dd4e4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" Apr 16 20:31:39.905965 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:39.905907 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-s7skk\"" Apr 16 20:31:39.917175 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:39.917151 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-epp-sa-dockercfg-rcgrr\"" Apr 16 20:31:39.925296 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:39.925275 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" Apr 16 20:31:40.085719 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:40.085692 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n"] Apr 16 20:31:40.087467 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:31:40.087430 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86755dcb_786d_4bf0_b34d_f306fe3dd4e4.slice/crio-a5e6ad8347e0895d3969729bd626b80aa7e386bef7ae2ec6f14e463882fdaa12 WatchSource:0}: Error finding container a5e6ad8347e0895d3969729bd626b80aa7e386bef7ae2ec6f14e463882fdaa12: Status 404 returned error can't find the container with id a5e6ad8347e0895d3969729bd626b80aa7e386bef7ae2ec6f14e463882fdaa12 Apr 16 20:31:40.635684 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:40.635645 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" event={"ID":"86755dcb-786d-4bf0-b34d-f306fe3dd4e4","Type":"ContainerStarted","Data":"56bde7d99fc349de32a4a3ca173e59144532c0108d27760e4695c145c7b68f31"} Apr 16 20:31:40.635684 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:40.635686 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" event={"ID":"86755dcb-786d-4bf0-b34d-f306fe3dd4e4","Type":"ContainerStarted","Data":"a5e6ad8347e0895d3969729bd626b80aa7e386bef7ae2ec6f14e463882fdaa12"} Apr 16 20:31:41.639532 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:41.639497 2576 generic.go:358] "Generic (PLEG): container finished" podID="86755dcb-786d-4bf0-b34d-f306fe3dd4e4" containerID="56bde7d99fc349de32a4a3ca173e59144532c0108d27760e4695c145c7b68f31" exitCode=0 Apr 16 20:31:41.639884 ip-10-0-141-145 kubenswrapper[2576]: I0416 
20:31:41.639549 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" event={"ID":"86755dcb-786d-4bf0-b34d-f306fe3dd4e4","Type":"ContainerDied","Data":"56bde7d99fc349de32a4a3ca173e59144532c0108d27760e4695c145c7b68f31"} Apr 16 20:31:42.644845 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:42.644810 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" event={"ID":"86755dcb-786d-4bf0-b34d-f306fe3dd4e4","Type":"ContainerStarted","Data":"45f5601bf5541d9c4499ebcc663cada3ecf5c3994c205c804750d6616470fe1b"} Apr 16 20:31:42.644845 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:42.644846 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" event={"ID":"86755dcb-786d-4bf0-b34d-f306fe3dd4e4","Type":"ContainerStarted","Data":"b6c31bad5838eeb05b17f84c6342321147e62018cdec9a0ec3a4ed95fc32a34f"} Apr 16 20:31:42.645250 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:42.644949 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" Apr 16 20:31:42.664179 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:42.664132 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" podStartSLOduration=3.664118189 podStartE2EDuration="3.664118189s" podCreationTimestamp="2026-04-16 20:31:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:31:42.662856172 +0000 UTC m=+1203.197211923" watchObservedRunningTime="2026-04-16 20:31:42.664118189 +0000 UTC m=+1203.198473919" Apr 16 20:31:49.925442 ip-10-0-141-145 
kubenswrapper[2576]: I0416 20:31:49.925388 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" Apr 16 20:31:49.925442 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:49.925437 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" Apr 16 20:31:49.928160 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:49.928124 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" Apr 16 20:31:50.671705 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:31:50.671670 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" Apr 16 20:32:11.675840 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:32:11.675807 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" Apr 16 20:34:02.429240 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:02.429162 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n"] Apr 16 20:34:02.429797 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:02.429529 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" podUID="86755dcb-786d-4bf0-b34d-f306fe3dd4e4" containerName="main" containerID="cri-o://b6c31bad5838eeb05b17f84c6342321147e62018cdec9a0ec3a4ed95fc32a34f" gracePeriod=30 Apr 16 20:34:02.429797 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:02.429603 2576 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" podUID="86755dcb-786d-4bf0-b34d-f306fe3dd4e4" containerName="tokenizer" containerID="cri-o://45f5601bf5541d9c4499ebcc663cada3ecf5c3994c205c804750d6616470fe1b" gracePeriod=30 Apr 16 20:34:03.095028 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:03.094996 2576 generic.go:358] "Generic (PLEG): container finished" podID="86755dcb-786d-4bf0-b34d-f306fe3dd4e4" containerID="b6c31bad5838eeb05b17f84c6342321147e62018cdec9a0ec3a4ed95fc32a34f" exitCode=0 Apr 16 20:34:03.095230 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:03.095085 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" event={"ID":"86755dcb-786d-4bf0-b34d-f306fe3dd4e4","Type":"ContainerDied","Data":"b6c31bad5838eeb05b17f84c6342321147e62018cdec9a0ec3a4ed95fc32a34f"} Apr 16 20:34:03.479597 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:03.479577 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" Apr 16 20:34:03.534168 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:03.534145 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86755dcb-786d-4bf0-b34d-f306fe3dd4e4-kserve-provision-location\") pod \"86755dcb-786d-4bf0-b34d-f306fe3dd4e4\" (UID: \"86755dcb-786d-4bf0-b34d-f306fe3dd4e4\") " Apr 16 20:34:03.534259 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:03.534192 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/86755dcb-786d-4bf0-b34d-f306fe3dd4e4-tokenizer-uds\") pod \"86755dcb-786d-4bf0-b34d-f306fe3dd4e4\" (UID: \"86755dcb-786d-4bf0-b34d-f306fe3dd4e4\") " Apr 16 20:34:03.534259 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:03.534213 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/86755dcb-786d-4bf0-b34d-f306fe3dd4e4-tls-certs\") pod \"86755dcb-786d-4bf0-b34d-f306fe3dd4e4\" (UID: \"86755dcb-786d-4bf0-b34d-f306fe3dd4e4\") " Apr 16 20:34:03.534259 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:03.534235 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/86755dcb-786d-4bf0-b34d-f306fe3dd4e4-tokenizer-tmp\") pod \"86755dcb-786d-4bf0-b34d-f306fe3dd4e4\" (UID: \"86755dcb-786d-4bf0-b34d-f306fe3dd4e4\") " Apr 16 20:34:03.534378 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:03.534271 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7gc2\" (UniqueName: \"kubernetes.io/projected/86755dcb-786d-4bf0-b34d-f306fe3dd4e4-kube-api-access-w7gc2\") pod \"86755dcb-786d-4bf0-b34d-f306fe3dd4e4\" (UID: \"86755dcb-786d-4bf0-b34d-f306fe3dd4e4\") " 
Apr 16 20:34:03.534378 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:03.534306 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/86755dcb-786d-4bf0-b34d-f306fe3dd4e4-tokenizer-cache\") pod \"86755dcb-786d-4bf0-b34d-f306fe3dd4e4\" (UID: \"86755dcb-786d-4bf0-b34d-f306fe3dd4e4\") " Apr 16 20:34:03.534476 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:03.534402 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86755dcb-786d-4bf0-b34d-f306fe3dd4e4-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "86755dcb-786d-4bf0-b34d-f306fe3dd4e4" (UID: "86755dcb-786d-4bf0-b34d-f306fe3dd4e4"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:34:03.534580 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:03.534563 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/86755dcb-786d-4bf0-b34d-f306fe3dd4e4-tokenizer-uds\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:34:03.534656 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:03.534623 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86755dcb-786d-4bf0-b34d-f306fe3dd4e4-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "86755dcb-786d-4bf0-b34d-f306fe3dd4e4" (UID: "86755dcb-786d-4bf0-b34d-f306fe3dd4e4"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:34:03.534656 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:03.534636 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86755dcb-786d-4bf0-b34d-f306fe3dd4e4-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "86755dcb-786d-4bf0-b34d-f306fe3dd4e4" (UID: "86755dcb-786d-4bf0-b34d-f306fe3dd4e4"). 
InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:34:03.534992 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:03.534957 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86755dcb-786d-4bf0-b34d-f306fe3dd4e4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "86755dcb-786d-4bf0-b34d-f306fe3dd4e4" (UID: "86755dcb-786d-4bf0-b34d-f306fe3dd4e4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:34:03.536198 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:03.536178 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86755dcb-786d-4bf0-b34d-f306fe3dd4e4-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "86755dcb-786d-4bf0-b34d-f306fe3dd4e4" (UID: "86755dcb-786d-4bf0-b34d-f306fe3dd4e4"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:34:03.536382 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:03.536365 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86755dcb-786d-4bf0-b34d-f306fe3dd4e4-kube-api-access-w7gc2" (OuterVolumeSpecName: "kube-api-access-w7gc2") pod "86755dcb-786d-4bf0-b34d-f306fe3dd4e4" (UID: "86755dcb-786d-4bf0-b34d-f306fe3dd4e4"). InnerVolumeSpecName "kube-api-access-w7gc2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:34:03.635057 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:03.635007 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/86755dcb-786d-4bf0-b34d-f306fe3dd4e4-tokenizer-cache\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:34:03.635057 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:03.635028 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86755dcb-786d-4bf0-b34d-f306fe3dd4e4-kserve-provision-location\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:34:03.635057 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:03.635038 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/86755dcb-786d-4bf0-b34d-f306fe3dd4e4-tls-certs\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:34:03.635057 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:03.635047 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/86755dcb-786d-4bf0-b34d-f306fe3dd4e4-tokenizer-tmp\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:34:03.635057 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:03.635055 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w7gc2\" (UniqueName: \"kubernetes.io/projected/86755dcb-786d-4bf0-b34d-f306fe3dd4e4-kube-api-access-w7gc2\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:34:04.099920 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:04.099891 2576 generic.go:358] "Generic (PLEG): container finished" podID="86755dcb-786d-4bf0-b34d-f306fe3dd4e4" containerID="45f5601bf5541d9c4499ebcc663cada3ecf5c3994c205c804750d6616470fe1b" exitCode=0 Apr 16 20:34:04.100051 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:04.099936 
2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" event={"ID":"86755dcb-786d-4bf0-b34d-f306fe3dd4e4","Type":"ContainerDied","Data":"45f5601bf5541d9c4499ebcc663cada3ecf5c3994c205c804750d6616470fe1b"} Apr 16 20:34:04.100051 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:04.099956 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" event={"ID":"86755dcb-786d-4bf0-b34d-f306fe3dd4e4","Type":"ContainerDied","Data":"a5e6ad8347e0895d3969729bd626b80aa7e386bef7ae2ec6f14e463882fdaa12"} Apr 16 20:34:04.100051 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:04.099960 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n" Apr 16 20:34:04.100051 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:04.099970 2576 scope.go:117] "RemoveContainer" containerID="45f5601bf5541d9c4499ebcc663cada3ecf5c3994c205c804750d6616470fe1b" Apr 16 20:34:04.109043 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:04.109011 2576 scope.go:117] "RemoveContainer" containerID="b6c31bad5838eeb05b17f84c6342321147e62018cdec9a0ec3a4ed95fc32a34f" Apr 16 20:34:04.115756 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:04.115731 2576 scope.go:117] "RemoveContainer" containerID="56bde7d99fc349de32a4a3ca173e59144532c0108d27760e4695c145c7b68f31" Apr 16 20:34:04.119074 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:04.119049 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n"] Apr 16 20:34:04.122958 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:04.122933 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schel754n"] Apr 16 20:34:04.124214 
ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:04.124194 2576 scope.go:117] "RemoveContainer" containerID="45f5601bf5541d9c4499ebcc663cada3ecf5c3994c205c804750d6616470fe1b" Apr 16 20:34:04.124492 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:34:04.124475 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45f5601bf5541d9c4499ebcc663cada3ecf5c3994c205c804750d6616470fe1b\": container with ID starting with 45f5601bf5541d9c4499ebcc663cada3ecf5c3994c205c804750d6616470fe1b not found: ID does not exist" containerID="45f5601bf5541d9c4499ebcc663cada3ecf5c3994c205c804750d6616470fe1b" Apr 16 20:34:04.124556 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:04.124499 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45f5601bf5541d9c4499ebcc663cada3ecf5c3994c205c804750d6616470fe1b"} err="failed to get container status \"45f5601bf5541d9c4499ebcc663cada3ecf5c3994c205c804750d6616470fe1b\": rpc error: code = NotFound desc = could not find container \"45f5601bf5541d9c4499ebcc663cada3ecf5c3994c205c804750d6616470fe1b\": container with ID starting with 45f5601bf5541d9c4499ebcc663cada3ecf5c3994c205c804750d6616470fe1b not found: ID does not exist" Apr 16 20:34:04.124556 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:04.124515 2576 scope.go:117] "RemoveContainer" containerID="b6c31bad5838eeb05b17f84c6342321147e62018cdec9a0ec3a4ed95fc32a34f" Apr 16 20:34:04.124707 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:34:04.124689 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6c31bad5838eeb05b17f84c6342321147e62018cdec9a0ec3a4ed95fc32a34f\": container with ID starting with b6c31bad5838eeb05b17f84c6342321147e62018cdec9a0ec3a4ed95fc32a34f not found: ID does not exist" containerID="b6c31bad5838eeb05b17f84c6342321147e62018cdec9a0ec3a4ed95fc32a34f" Apr 16 20:34:04.124744 ip-10-0-141-145 
kubenswrapper[2576]: I0416 20:34:04.124713 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6c31bad5838eeb05b17f84c6342321147e62018cdec9a0ec3a4ed95fc32a34f"} err="failed to get container status \"b6c31bad5838eeb05b17f84c6342321147e62018cdec9a0ec3a4ed95fc32a34f\": rpc error: code = NotFound desc = could not find container \"b6c31bad5838eeb05b17f84c6342321147e62018cdec9a0ec3a4ed95fc32a34f\": container with ID starting with b6c31bad5838eeb05b17f84c6342321147e62018cdec9a0ec3a4ed95fc32a34f not found: ID does not exist" Apr 16 20:34:04.124744 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:04.124727 2576 scope.go:117] "RemoveContainer" containerID="56bde7d99fc349de32a4a3ca173e59144532c0108d27760e4695c145c7b68f31" Apr 16 20:34:04.124951 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:34:04.124932 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56bde7d99fc349de32a4a3ca173e59144532c0108d27760e4695c145c7b68f31\": container with ID starting with 56bde7d99fc349de32a4a3ca173e59144532c0108d27760e4695c145c7b68f31 not found: ID does not exist" containerID="56bde7d99fc349de32a4a3ca173e59144532c0108d27760e4695c145c7b68f31" Apr 16 20:34:04.125010 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:04.124954 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56bde7d99fc349de32a4a3ca173e59144532c0108d27760e4695c145c7b68f31"} err="failed to get container status \"56bde7d99fc349de32a4a3ca173e59144532c0108d27760e4695c145c7b68f31\": rpc error: code = NotFound desc = could not find container \"56bde7d99fc349de32a4a3ca173e59144532c0108d27760e4695c145c7b68f31\": container with ID starting with 56bde7d99fc349de32a4a3ca173e59144532c0108d27760e4695c145c7b68f31 not found: ID does not exist" Apr 16 20:34:06.059897 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:06.059865 2576 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="86755dcb-786d-4bf0-b34d-f306fe3dd4e4" path="/var/lib/kubelet/pods/86755dcb-786d-4bf0-b34d-f306fe3dd4e4/volumes" Apr 16 20:34:18.770361 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:18.770329 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm"] Apr 16 20:34:18.770718 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:18.770646 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86755dcb-786d-4bf0-b34d-f306fe3dd4e4" containerName="main" Apr 16 20:34:18.770718 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:18.770658 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="86755dcb-786d-4bf0-b34d-f306fe3dd4e4" containerName="main" Apr 16 20:34:18.770718 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:18.770668 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86755dcb-786d-4bf0-b34d-f306fe3dd4e4" containerName="storage-initializer" Apr 16 20:34:18.770718 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:18.770705 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="86755dcb-786d-4bf0-b34d-f306fe3dd4e4" containerName="storage-initializer" Apr 16 20:34:18.770718 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:18.770716 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86755dcb-786d-4bf0-b34d-f306fe3dd4e4" containerName="tokenizer" Apr 16 20:34:18.770906 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:18.770735 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="86755dcb-786d-4bf0-b34d-f306fe3dd4e4" containerName="tokenizer" Apr 16 20:34:18.770906 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:18.770790 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="86755dcb-786d-4bf0-b34d-f306fe3dd4e4" containerName="tokenizer" Apr 16 20:34:18.770906 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:18.770821 2576 
memory_manager.go:356] "RemoveStaleState removing state" podUID="86755dcb-786d-4bf0-b34d-f306fe3dd4e4" containerName="main" Apr 16 20:34:18.775758 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:18.775731 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm" Apr 16 20:34:18.778047 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:18.778024 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 16 20:34:18.778176 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:18.778087 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-epp-sa-dockercfg-7466c\"" Apr 16 20:34:18.778755 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:18.778734 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-s7skk\"" Apr 16 20:34:18.784225 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:18.784207 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm"] Apr 16 20:34:18.858145 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:18.858123 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/476b0a54-5285-4647-8df2-0b213b567168-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm\" (UID: \"476b0a54-5285-4647-8df2-0b213b567168\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm" Apr 16 20:34:18.858250 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:18.858152 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fd2hr\" (UniqueName: \"kubernetes.io/projected/476b0a54-5285-4647-8df2-0b213b567168-kube-api-access-fd2hr\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm\" (UID: \"476b0a54-5285-4647-8df2-0b213b567168\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm" Apr 16 20:34:18.858250 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:18.858184 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/476b0a54-5285-4647-8df2-0b213b567168-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm\" (UID: \"476b0a54-5285-4647-8df2-0b213b567168\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm" Apr 16 20:34:18.858333 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:18.858248 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/476b0a54-5285-4647-8df2-0b213b567168-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm\" (UID: \"476b0a54-5285-4647-8df2-0b213b567168\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm" Apr 16 20:34:18.858333 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:18.858308 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/476b0a54-5285-4647-8df2-0b213b567168-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm\" (UID: \"476b0a54-5285-4647-8df2-0b213b567168\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm" Apr 16 20:34:18.858333 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:18.858328 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/476b0a54-5285-4647-8df2-0b213b567168-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm\" (UID: \"476b0a54-5285-4647-8df2-0b213b567168\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm" Apr 16 20:34:18.959146 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:18.959115 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/476b0a54-5285-4647-8df2-0b213b567168-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm\" (UID: \"476b0a54-5285-4647-8df2-0b213b567168\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm" Apr 16 20:34:18.959401 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:18.959376 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/476b0a54-5285-4647-8df2-0b213b567168-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm\" (UID: \"476b0a54-5285-4647-8df2-0b213b567168\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm" Apr 16 20:34:18.959878 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:18.959844 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/476b0a54-5285-4647-8df2-0b213b567168-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm\" (UID: \"476b0a54-5285-4647-8df2-0b213b567168\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm" Apr 16 20:34:18.960028 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:18.959543 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/476b0a54-5285-4647-8df2-0b213b567168-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm\" (UID: \"476b0a54-5285-4647-8df2-0b213b567168\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm" Apr 16 20:34:18.960028 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:18.959921 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/476b0a54-5285-4647-8df2-0b213b567168-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm\" (UID: \"476b0a54-5285-4647-8df2-0b213b567168\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm" Apr 16 20:34:18.960028 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:18.959950 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fd2hr\" (UniqueName: \"kubernetes.io/projected/476b0a54-5285-4647-8df2-0b213b567168-kube-api-access-fd2hr\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm\" (UID: \"476b0a54-5285-4647-8df2-0b213b567168\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm" Apr 16 20:34:18.960200 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:18.960029 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/476b0a54-5285-4647-8df2-0b213b567168-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm\" (UID: \"476b0a54-5285-4647-8df2-0b213b567168\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm" Apr 16 20:34:18.960253 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:18.960204 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" 
(UniqueName: \"kubernetes.io/empty-dir/476b0a54-5285-4647-8df2-0b213b567168-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm\" (UID: \"476b0a54-5285-4647-8df2-0b213b567168\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm" Apr 16 20:34:18.960486 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:18.960450 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/476b0a54-5285-4647-8df2-0b213b567168-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm\" (UID: \"476b0a54-5285-4647-8df2-0b213b567168\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm" Apr 16 20:34:18.960486 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:18.960469 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/476b0a54-5285-4647-8df2-0b213b567168-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm\" (UID: \"476b0a54-5285-4647-8df2-0b213b567168\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm" Apr 16 20:34:18.962391 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:18.962361 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/476b0a54-5285-4647-8df2-0b213b567168-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm\" (UID: \"476b0a54-5285-4647-8df2-0b213b567168\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm" Apr 16 20:34:18.968803 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:18.968781 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd2hr\" (UniqueName: 
\"kubernetes.io/projected/476b0a54-5285-4647-8df2-0b213b567168-kube-api-access-fd2hr\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm\" (UID: \"476b0a54-5285-4647-8df2-0b213b567168\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm" Apr 16 20:34:19.086019 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:19.085922 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm" Apr 16 20:34:19.204156 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:19.204098 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm"] Apr 16 20:34:19.206422 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:34:19.206395 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod476b0a54_5285_4647_8df2_0b213b567168.slice/crio-dbcead9e585b2c7731a0682c925f7e5921895aa07550db7f6f014be83031e93a WatchSource:0}: Error finding container dbcead9e585b2c7731a0682c925f7e5921895aa07550db7f6f014be83031e93a: Status 404 returned error can't find the container with id dbcead9e585b2c7731a0682c925f7e5921895aa07550db7f6f014be83031e93a Apr 16 20:34:20.158715 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:20.158686 2576 generic.go:358] "Generic (PLEG): container finished" podID="476b0a54-5285-4647-8df2-0b213b567168" containerID="246619f077559e9b5c4bdccad9b938f50346369c7bd81875ee1dcc434a9e4f09" exitCode=0 Apr 16 20:34:20.159060 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:20.158766 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm" event={"ID":"476b0a54-5285-4647-8df2-0b213b567168","Type":"ContainerDied","Data":"246619f077559e9b5c4bdccad9b938f50346369c7bd81875ee1dcc434a9e4f09"} Apr 16 
20:34:20.159060 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:20.158797 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm" event={"ID":"476b0a54-5285-4647-8df2-0b213b567168","Type":"ContainerStarted","Data":"dbcead9e585b2c7731a0682c925f7e5921895aa07550db7f6f014be83031e93a"} Apr 16 20:34:21.164109 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:21.164068 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm" event={"ID":"476b0a54-5285-4647-8df2-0b213b567168","Type":"ContainerStarted","Data":"d1e23f80719b750bf08657c74d755b6f7bfa25e180a6aefdf4588432bde1e885"} Apr 16 20:34:21.164109 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:21.164100 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm" event={"ID":"476b0a54-5285-4647-8df2-0b213b567168","Type":"ContainerStarted","Data":"9ef5c357c986d35a9164aa423424f8b02b4a81c5a0e1850cb293dc0a5437bb7a"} Apr 16 20:34:21.164610 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:21.164194 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm" Apr 16 20:34:21.184779 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:21.184736 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm" podStartSLOduration=3.184725457 podStartE2EDuration="3.184725457s" podCreationTimestamp="2026-04-16 20:34:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:34:21.182654685 +0000 UTC m=+1361.717010413" watchObservedRunningTime="2026-04-16 20:34:21.184725457 +0000 UTC 
m=+1361.719081182" Apr 16 20:34:25.152633 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:25.152601 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz"] Apr 16 20:34:25.156927 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:25.156903 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" Apr 16 20:34:25.159433 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:25.159410 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5ec-epp-sa-dockercfg-87wkr\"" Apr 16 20:34:25.159558 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:25.159514 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 16 20:34:25.165642 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:25.165619 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz"] Apr 16 20:34:25.308092 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:25.308054 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r88c4\" (UniqueName: \"kubernetes.io/projected/e49888ec-b241-4508-a292-23797b64018c-kube-api-access-r88c4\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz\" (UID: \"e49888ec-b241-4508-a292-23797b64018c\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" Apr 16 20:34:25.308273 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:25.308113 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e49888ec-b241-4508-a292-23797b64018c-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz\" (UID: \"e49888ec-b241-4508-a292-23797b64018c\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" Apr 16 20:34:25.308273 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:25.308146 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e49888ec-b241-4508-a292-23797b64018c-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz\" (UID: \"e49888ec-b241-4508-a292-23797b64018c\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" Apr 16 20:34:25.308273 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:25.308181 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e49888ec-b241-4508-a292-23797b64018c-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz\" (UID: \"e49888ec-b241-4508-a292-23797b64018c\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" Apr 16 20:34:25.308273 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:25.308230 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e49888ec-b241-4508-a292-23797b64018c-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz\" (UID: \"e49888ec-b241-4508-a292-23797b64018c\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" Apr 16 20:34:25.308273 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:25.308256 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e49888ec-b241-4508-a292-23797b64018c-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz\" (UID: \"e49888ec-b241-4508-a292-23797b64018c\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" Apr 16 20:34:25.408859 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:25.408792 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e49888ec-b241-4508-a292-23797b64018c-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz\" (UID: \"e49888ec-b241-4508-a292-23797b64018c\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" Apr 16 20:34:25.408859 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:25.408831 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e49888ec-b241-4508-a292-23797b64018c-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz\" (UID: \"e49888ec-b241-4508-a292-23797b64018c\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" Apr 16 20:34:25.409085 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:25.408870 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e49888ec-b241-4508-a292-23797b64018c-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz\" (UID: \"e49888ec-b241-4508-a292-23797b64018c\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" Apr 16 20:34:25.409085 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:25.408897 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/e49888ec-b241-4508-a292-23797b64018c-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz\" (UID: \"e49888ec-b241-4508-a292-23797b64018c\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" Apr 16 20:34:25.409085 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:25.408943 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e49888ec-b241-4508-a292-23797b64018c-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz\" (UID: \"e49888ec-b241-4508-a292-23797b64018c\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" Apr 16 20:34:25.409085 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:25.409026 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r88c4\" (UniqueName: \"kubernetes.io/projected/e49888ec-b241-4508-a292-23797b64018c-kube-api-access-r88c4\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz\" (UID: \"e49888ec-b241-4508-a292-23797b64018c\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" Apr 16 20:34:25.409481 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:25.409424 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e49888ec-b241-4508-a292-23797b64018c-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz\" (UID: \"e49888ec-b241-4508-a292-23797b64018c\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" Apr 16 20:34:25.409481 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:25.409477 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/e49888ec-b241-4508-a292-23797b64018c-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz\" (UID: \"e49888ec-b241-4508-a292-23797b64018c\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" Apr 16 20:34:25.409624 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:25.409518 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e49888ec-b241-4508-a292-23797b64018c-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz\" (UID: \"e49888ec-b241-4508-a292-23797b64018c\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" Apr 16 20:34:25.409624 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:25.409589 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e49888ec-b241-4508-a292-23797b64018c-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz\" (UID: \"e49888ec-b241-4508-a292-23797b64018c\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" Apr 16 20:34:25.411384 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:25.411353 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e49888ec-b241-4508-a292-23797b64018c-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz\" (UID: \"e49888ec-b241-4508-a292-23797b64018c\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" Apr 16 20:34:25.417522 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:25.417499 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r88c4\" (UniqueName: 
\"kubernetes.io/projected/e49888ec-b241-4508-a292-23797b64018c-kube-api-access-r88c4\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz\" (UID: \"e49888ec-b241-4508-a292-23797b64018c\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" Apr 16 20:34:25.470899 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:25.470874 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" Apr 16 20:34:25.671553 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:25.671532 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz"] Apr 16 20:34:25.674469 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:34:25.674440 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode49888ec_b241_4508_a292_23797b64018c.slice/crio-7ab2ccbaec4c52ecebbd5292368ed74957bfa00de7cf095a1f4239458f5e6743 WatchSource:0}: Error finding container 7ab2ccbaec4c52ecebbd5292368ed74957bfa00de7cf095a1f4239458f5e6743: Status 404 returned error can't find the container with id 7ab2ccbaec4c52ecebbd5292368ed74957bfa00de7cf095a1f4239458f5e6743 Apr 16 20:34:26.182134 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:26.182098 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" event={"ID":"e49888ec-b241-4508-a292-23797b64018c","Type":"ContainerStarted","Data":"f18dc5afbe457edbea3e2bdbcb1bfeca1a0fb3d20648064c778f7998e3681a0d"} Apr 16 20:34:26.182134 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:26.182143 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" 
event={"ID":"e49888ec-b241-4508-a292-23797b64018c","Type":"ContainerStarted","Data":"7ab2ccbaec4c52ecebbd5292368ed74957bfa00de7cf095a1f4239458f5e6743"} Apr 16 20:34:27.186930 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:27.186893 2576 generic.go:358] "Generic (PLEG): container finished" podID="e49888ec-b241-4508-a292-23797b64018c" containerID="f18dc5afbe457edbea3e2bdbcb1bfeca1a0fb3d20648064c778f7998e3681a0d" exitCode=0 Apr 16 20:34:27.187379 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:27.187000 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" event={"ID":"e49888ec-b241-4508-a292-23797b64018c","Type":"ContainerDied","Data":"f18dc5afbe457edbea3e2bdbcb1bfeca1a0fb3d20648064c778f7998e3681a0d"} Apr 16 20:34:28.191812 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:28.191774 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" event={"ID":"e49888ec-b241-4508-a292-23797b64018c","Type":"ContainerStarted","Data":"a9f2e7ccff9ae59d2ac09d2216ddfb74bbb1287f4085e7739d2440159765466d"} Apr 16 20:34:28.191812 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:28.191810 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" event={"ID":"e49888ec-b241-4508-a292-23797b64018c","Type":"ContainerStarted","Data":"242c702d33ae3b607347a7e94652d5a7f397a45e9d12c4fe7ebe84a508c29ba5"} Apr 16 20:34:28.192331 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:28.191956 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" Apr 16 20:34:28.233916 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:28.233864 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" podStartSLOduration=3.233850204 podStartE2EDuration="3.233850204s" podCreationTimestamp="2026-04-16 20:34:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:34:28.231974144 +0000 UTC m=+1368.766329881" watchObservedRunningTime="2026-04-16 20:34:28.233850204 +0000 UTC m=+1368.768205929" Apr 16 20:34:29.086405 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:29.086368 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm" Apr 16 20:34:29.086598 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:29.086416 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm" Apr 16 20:34:29.088861 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:29.088836 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm" Apr 16 20:34:29.195761 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:29.195730 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm" Apr 16 20:34:35.471484 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:35.471455 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" Apr 16 20:34:35.471920 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:35.471771 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" Apr 16 20:34:35.474037 ip-10-0-141-145 
kubenswrapper[2576]: I0416 20:34:35.474015 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" Apr 16 20:34:36.219665 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:36.219643 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" Apr 16 20:34:50.198605 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:50.198575 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm" Apr 16 20:34:58.226063 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:34:58.226039 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" Apr 16 20:36:12.631143 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:12.631099 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz"] Apr 16 20:36:12.631634 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:12.631435 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" podUID="e49888ec-b241-4508-a292-23797b64018c" containerName="main" containerID="cri-o://242c702d33ae3b607347a7e94652d5a7f397a45e9d12c4fe7ebe84a508c29ba5" gracePeriod=30 Apr 16 20:36:12.631634 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:12.631477 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" podUID="e49888ec-b241-4508-a292-23797b64018c" containerName="tokenizer" containerID="cri-o://a9f2e7ccff9ae59d2ac09d2216ddfb74bbb1287f4085e7739d2440159765466d" 
gracePeriod=30
Apr 16 20:36:13.499498 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:13.499466 2576 generic.go:358] "Generic (PLEG): container finished" podID="e49888ec-b241-4508-a292-23797b64018c" containerID="242c702d33ae3b607347a7e94652d5a7f397a45e9d12c4fe7ebe84a508c29ba5" exitCode=0
Apr 16 20:36:13.499659 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:13.499539 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" event={"ID":"e49888ec-b241-4508-a292-23797b64018c","Type":"ContainerDied","Data":"242c702d33ae3b607347a7e94652d5a7f397a45e9d12c4fe7ebe84a508c29ba5"}
Apr 16 20:36:13.675506 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:13.675486 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz"
Apr 16 20:36:13.773718 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:13.773653 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r88c4\" (UniqueName: \"kubernetes.io/projected/e49888ec-b241-4508-a292-23797b64018c-kube-api-access-r88c4\") pod \"e49888ec-b241-4508-a292-23797b64018c\" (UID: \"e49888ec-b241-4508-a292-23797b64018c\") "
Apr 16 20:36:13.773853 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:13.773723 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e49888ec-b241-4508-a292-23797b64018c-tls-certs\") pod \"e49888ec-b241-4508-a292-23797b64018c\" (UID: \"e49888ec-b241-4508-a292-23797b64018c\") "
Apr 16 20:36:13.773853 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:13.773758 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e49888ec-b241-4508-a292-23797b64018c-kserve-provision-location\") pod \"e49888ec-b241-4508-a292-23797b64018c\" (UID: \"e49888ec-b241-4508-a292-23797b64018c\") "
Apr 16 20:36:13.773853 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:13.773784 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e49888ec-b241-4508-a292-23797b64018c-tokenizer-cache\") pod \"e49888ec-b241-4508-a292-23797b64018c\" (UID: \"e49888ec-b241-4508-a292-23797b64018c\") "
Apr 16 20:36:13.773853 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:13.773839 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e49888ec-b241-4508-a292-23797b64018c-tokenizer-tmp\") pod \"e49888ec-b241-4508-a292-23797b64018c\" (UID: \"e49888ec-b241-4508-a292-23797b64018c\") "
Apr 16 20:36:13.774073 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:13.773872 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e49888ec-b241-4508-a292-23797b64018c-tokenizer-uds\") pod \"e49888ec-b241-4508-a292-23797b64018c\" (UID: \"e49888ec-b241-4508-a292-23797b64018c\") "
Apr 16 20:36:13.774122 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:13.774098 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e49888ec-b241-4508-a292-23797b64018c-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "e49888ec-b241-4508-a292-23797b64018c" (UID: "e49888ec-b241-4508-a292-23797b64018c"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:36:13.774247 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:13.774217 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e49888ec-b241-4508-a292-23797b64018c-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "e49888ec-b241-4508-a292-23797b64018c" (UID: "e49888ec-b241-4508-a292-23797b64018c"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:36:13.774311 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:13.774252 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e49888ec-b241-4508-a292-23797b64018c-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "e49888ec-b241-4508-a292-23797b64018c" (UID: "e49888ec-b241-4508-a292-23797b64018c"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:36:13.774531 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:13.774510 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e49888ec-b241-4508-a292-23797b64018c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e49888ec-b241-4508-a292-23797b64018c" (UID: "e49888ec-b241-4508-a292-23797b64018c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:36:13.775744 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:13.775728 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e49888ec-b241-4508-a292-23797b64018c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e49888ec-b241-4508-a292-23797b64018c" (UID: "e49888ec-b241-4508-a292-23797b64018c"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:36:13.775861 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:13.775844 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e49888ec-b241-4508-a292-23797b64018c-kube-api-access-r88c4" (OuterVolumeSpecName: "kube-api-access-r88c4") pod "e49888ec-b241-4508-a292-23797b64018c" (UID: "e49888ec-b241-4508-a292-23797b64018c"). InnerVolumeSpecName "kube-api-access-r88c4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:36:13.874677 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:13.874650 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e49888ec-b241-4508-a292-23797b64018c-tls-certs\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\""
Apr 16 20:36:13.874677 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:13.874673 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e49888ec-b241-4508-a292-23797b64018c-kserve-provision-location\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\""
Apr 16 20:36:13.874813 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:13.874684 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e49888ec-b241-4508-a292-23797b64018c-tokenizer-cache\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\""
Apr 16 20:36:13.874813 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:13.874693 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e49888ec-b241-4508-a292-23797b64018c-tokenizer-tmp\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\""
Apr 16 20:36:13.874813 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:13.874702 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName:
\"kubernetes.io/empty-dir/e49888ec-b241-4508-a292-23797b64018c-tokenizer-uds\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\""
Apr 16 20:36:13.874813 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:13.874711 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r88c4\" (UniqueName: \"kubernetes.io/projected/e49888ec-b241-4508-a292-23797b64018c-kube-api-access-r88c4\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\""
Apr 16 20:36:14.503960 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:14.503929 2576 generic.go:358] "Generic (PLEG): container finished" podID="e49888ec-b241-4508-a292-23797b64018c" containerID="a9f2e7ccff9ae59d2ac09d2216ddfb74bbb1287f4085e7739d2440159765466d" exitCode=0
Apr 16 20:36:14.504114 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:14.504003 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" event={"ID":"e49888ec-b241-4508-a292-23797b64018c","Type":"ContainerDied","Data":"a9f2e7ccff9ae59d2ac09d2216ddfb74bbb1287f4085e7739d2440159765466d"}
Apr 16 20:36:14.504114 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:14.504022 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz"
Apr 16 20:36:14.504114 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:14.504029 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz" event={"ID":"e49888ec-b241-4508-a292-23797b64018c","Type":"ContainerDied","Data":"7ab2ccbaec4c52ecebbd5292368ed74957bfa00de7cf095a1f4239458f5e6743"}
Apr 16 20:36:14.504114 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:14.504045 2576 scope.go:117] "RemoveContainer" containerID="a9f2e7ccff9ae59d2ac09d2216ddfb74bbb1287f4085e7739d2440159765466d"
Apr 16 20:36:14.511791 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:14.511774 2576 scope.go:117] "RemoveContainer" containerID="242c702d33ae3b607347a7e94652d5a7f397a45e9d12c4fe7ebe84a508c29ba5"
Apr 16 20:36:14.518135 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:14.518119 2576 scope.go:117] "RemoveContainer" containerID="f18dc5afbe457edbea3e2bdbcb1bfeca1a0fb3d20648064c778f7998e3681a0d"
Apr 16 20:36:14.522043 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:14.522022 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz"]
Apr 16 20:36:14.526056 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:14.524937 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche8jclz"]
Apr 16 20:36:14.527567 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:14.527552 2576 scope.go:117] "RemoveContainer" containerID="a9f2e7ccff9ae59d2ac09d2216ddfb74bbb1287f4085e7739d2440159765466d"
Apr 16 20:36:14.527828 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:36:14.527811 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9f2e7ccff9ae59d2ac09d2216ddfb74bbb1287f4085e7739d2440159765466d\": container with ID starting with a9f2e7ccff9ae59d2ac09d2216ddfb74bbb1287f4085e7739d2440159765466d not found: ID does not exist" containerID="a9f2e7ccff9ae59d2ac09d2216ddfb74bbb1287f4085e7739d2440159765466d"
Apr 16 20:36:14.527891 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:14.527836 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9f2e7ccff9ae59d2ac09d2216ddfb74bbb1287f4085e7739d2440159765466d"} err="failed to get container status \"a9f2e7ccff9ae59d2ac09d2216ddfb74bbb1287f4085e7739d2440159765466d\": rpc error: code = NotFound desc = could not find container \"a9f2e7ccff9ae59d2ac09d2216ddfb74bbb1287f4085e7739d2440159765466d\": container with ID starting with a9f2e7ccff9ae59d2ac09d2216ddfb74bbb1287f4085e7739d2440159765466d not found: ID does not exist"
Apr 16 20:36:14.527891 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:14.527851 2576 scope.go:117] "RemoveContainer" containerID="242c702d33ae3b607347a7e94652d5a7f397a45e9d12c4fe7ebe84a508c29ba5"
Apr 16 20:36:14.528102 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:36:14.528088 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"242c702d33ae3b607347a7e94652d5a7f397a45e9d12c4fe7ebe84a508c29ba5\": container with ID starting with 242c702d33ae3b607347a7e94652d5a7f397a45e9d12c4fe7ebe84a508c29ba5 not found: ID does not exist" containerID="242c702d33ae3b607347a7e94652d5a7f397a45e9d12c4fe7ebe84a508c29ba5"
Apr 16 20:36:14.528152 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:14.528106 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"242c702d33ae3b607347a7e94652d5a7f397a45e9d12c4fe7ebe84a508c29ba5"} err="failed to get container status \"242c702d33ae3b607347a7e94652d5a7f397a45e9d12c4fe7ebe84a508c29ba5\": rpc error: code = NotFound desc = could not find container \"242c702d33ae3b607347a7e94652d5a7f397a45e9d12c4fe7ebe84a508c29ba5\": container with ID starting with 242c702d33ae3b607347a7e94652d5a7f397a45e9d12c4fe7ebe84a508c29ba5 not found: ID does not exist"
Apr 16 20:36:14.528152 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:14.528118 2576 scope.go:117] "RemoveContainer" containerID="f18dc5afbe457edbea3e2bdbcb1bfeca1a0fb3d20648064c778f7998e3681a0d"
Apr 16 20:36:14.528298 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:36:14.528283 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f18dc5afbe457edbea3e2bdbcb1bfeca1a0fb3d20648064c778f7998e3681a0d\": container with ID starting with f18dc5afbe457edbea3e2bdbcb1bfeca1a0fb3d20648064c778f7998e3681a0d not found: ID does not exist" containerID="f18dc5afbe457edbea3e2bdbcb1bfeca1a0fb3d20648064c778f7998e3681a0d"
Apr 16 20:36:14.528334 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:14.528301 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f18dc5afbe457edbea3e2bdbcb1bfeca1a0fb3d20648064c778f7998e3681a0d"} err="failed to get container status \"f18dc5afbe457edbea3e2bdbcb1bfeca1a0fb3d20648064c778f7998e3681a0d\": rpc error: code = NotFound desc = could not find container \"f18dc5afbe457edbea3e2bdbcb1bfeca1a0fb3d20648064c778f7998e3681a0d\": container with ID starting with f18dc5afbe457edbea3e2bdbcb1bfeca1a0fb3d20648064c778f7998e3681a0d not found: ID does not exist"
Apr 16 20:36:16.058679 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:36:16.058648 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e49888ec-b241-4508-a292-23797b64018c" path="/var/lib/kubelet/pods/e49888ec-b241-4508-a292-23797b64018c/volumes"
Apr 16 20:37:20.854282 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:20.854247 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm"]
Apr 16
20:37:20.854811 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:20.854659 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm" podUID="476b0a54-5285-4647-8df2-0b213b567168" containerName="main" containerID="cri-o://9ef5c357c986d35a9164aa423424f8b02b4a81c5a0e1850cb293dc0a5437bb7a" gracePeriod=30
Apr 16 20:37:20.855020 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:20.854995 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm" podUID="476b0a54-5285-4647-8df2-0b213b567168" containerName="tokenizer" containerID="cri-o://d1e23f80719b750bf08657c74d755b6f7bfa25e180a6aefdf4588432bde1e885" gracePeriod=30
Apr 16 20:37:21.709083 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:21.709036 2576 generic.go:358] "Generic (PLEG): container finished" podID="476b0a54-5285-4647-8df2-0b213b567168" containerID="9ef5c357c986d35a9164aa423424f8b02b4a81c5a0e1850cb293dc0a5437bb7a" exitCode=0
Apr 16 20:37:21.709247 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:21.709112 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm" event={"ID":"476b0a54-5285-4647-8df2-0b213b567168","Type":"ContainerDied","Data":"9ef5c357c986d35a9164aa423424f8b02b4a81c5a0e1850cb293dc0a5437bb7a"}
Apr 16 20:37:22.006047 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:22.006025 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm"
Apr 16 20:37:22.047477 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:22.047447 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/476b0a54-5285-4647-8df2-0b213b567168-tokenizer-uds\") pod \"476b0a54-5285-4647-8df2-0b213b567168\" (UID: \"476b0a54-5285-4647-8df2-0b213b567168\") "
Apr 16 20:37:22.047588 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:22.047481 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fd2hr\" (UniqueName: \"kubernetes.io/projected/476b0a54-5285-4647-8df2-0b213b567168-kube-api-access-fd2hr\") pod \"476b0a54-5285-4647-8df2-0b213b567168\" (UID: \"476b0a54-5285-4647-8df2-0b213b567168\") "
Apr 16 20:37:22.047588 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:22.047524 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/476b0a54-5285-4647-8df2-0b213b567168-tokenizer-cache\") pod \"476b0a54-5285-4647-8df2-0b213b567168\" (UID: \"476b0a54-5285-4647-8df2-0b213b567168\") "
Apr 16 20:37:22.047588 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:22.047580 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/476b0a54-5285-4647-8df2-0b213b567168-tokenizer-tmp\") pod \"476b0a54-5285-4647-8df2-0b213b567168\" (UID: \"476b0a54-5285-4647-8df2-0b213b567168\") "
Apr 16 20:37:22.047726 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:22.047629 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/476b0a54-5285-4647-8df2-0b213b567168-kserve-provision-location\") pod \"476b0a54-5285-4647-8df2-0b213b567168\" (UID: \"476b0a54-5285-4647-8df2-0b213b567168\") "
Apr 16 20:37:22.047726 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:22.047676 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/476b0a54-5285-4647-8df2-0b213b567168-tls-certs\") pod \"476b0a54-5285-4647-8df2-0b213b567168\" (UID: \"476b0a54-5285-4647-8df2-0b213b567168\") "
Apr 16 20:37:22.047882 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:22.047705 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/476b0a54-5285-4647-8df2-0b213b567168-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "476b0a54-5285-4647-8df2-0b213b567168" (UID: "476b0a54-5285-4647-8df2-0b213b567168"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:37:22.047882 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:22.047786 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/476b0a54-5285-4647-8df2-0b213b567168-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "476b0a54-5285-4647-8df2-0b213b567168" (UID: "476b0a54-5285-4647-8df2-0b213b567168"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:37:22.047882 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:22.047868 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/476b0a54-5285-4647-8df2-0b213b567168-tokenizer-uds\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\""
Apr 16 20:37:22.047882 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:22.047880 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/476b0a54-5285-4647-8df2-0b213b567168-tokenizer-cache\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\""
Apr 16 20:37:22.048075 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:22.047950 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/476b0a54-5285-4647-8df2-0b213b567168-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "476b0a54-5285-4647-8df2-0b213b567168" (UID: "476b0a54-5285-4647-8df2-0b213b567168"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:37:22.048272 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:22.048245 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/476b0a54-5285-4647-8df2-0b213b567168-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "476b0a54-5285-4647-8df2-0b213b567168" (UID: "476b0a54-5285-4647-8df2-0b213b567168"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:37:22.049686 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:22.049653 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/476b0a54-5285-4647-8df2-0b213b567168-kube-api-access-fd2hr" (OuterVolumeSpecName: "kube-api-access-fd2hr") pod "476b0a54-5285-4647-8df2-0b213b567168" (UID: "476b0a54-5285-4647-8df2-0b213b567168"). InnerVolumeSpecName "kube-api-access-fd2hr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:37:22.050006 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:22.049962 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/476b0a54-5285-4647-8df2-0b213b567168-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "476b0a54-5285-4647-8df2-0b213b567168" (UID: "476b0a54-5285-4647-8df2-0b213b567168"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:37:22.148316 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:22.148285 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/476b0a54-5285-4647-8df2-0b213b567168-kserve-provision-location\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\""
Apr 16 20:37:22.148316 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:22.148316 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/476b0a54-5285-4647-8df2-0b213b567168-tls-certs\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\""
Apr 16 20:37:22.148480 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:22.148331 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fd2hr\" (UniqueName: \"kubernetes.io/projected/476b0a54-5285-4647-8df2-0b213b567168-kube-api-access-fd2hr\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\""
Apr 16 20:37:22.148480 ip-10-0-141-145 kubenswrapper[2576]:
I0416 20:37:22.148344 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/476b0a54-5285-4647-8df2-0b213b567168-tokenizer-tmp\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\""
Apr 16 20:37:22.714335 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:22.714297 2576 generic.go:358] "Generic (PLEG): container finished" podID="476b0a54-5285-4647-8df2-0b213b567168" containerID="d1e23f80719b750bf08657c74d755b6f7bfa25e180a6aefdf4588432bde1e885" exitCode=0
Apr 16 20:37:22.714500 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:22.714366 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm"
Apr 16 20:37:22.714500 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:22.714367 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm" event={"ID":"476b0a54-5285-4647-8df2-0b213b567168","Type":"ContainerDied","Data":"d1e23f80719b750bf08657c74d755b6f7bfa25e180a6aefdf4588432bde1e885"}
Apr 16 20:37:22.714500 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:22.714472 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm" event={"ID":"476b0a54-5285-4647-8df2-0b213b567168","Type":"ContainerDied","Data":"dbcead9e585b2c7731a0682c925f7e5921895aa07550db7f6f014be83031e93a"}
Apr 16 20:37:22.714500 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:22.714487 2576 scope.go:117] "RemoveContainer" containerID="d1e23f80719b750bf08657c74d755b6f7bfa25e180a6aefdf4588432bde1e885"
Apr 16 20:37:22.722251 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:22.722225 2576 scope.go:117] "RemoveContainer" containerID="9ef5c357c986d35a9164aa423424f8b02b4a81c5a0e1850cb293dc0a5437bb7a"
Apr 16 20:37:22.729159 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:22.729138 2576 scope.go:117] "RemoveContainer" containerID="246619f077559e9b5c4bdccad9b938f50346369c7bd81875ee1dcc434a9e4f09"
Apr 16 20:37:22.731946 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:22.731922 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm"]
Apr 16 20:37:22.735959 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:22.735938 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6bc78mgttm"]
Apr 16 20:37:22.736593 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:22.736572 2576 scope.go:117] "RemoveContainer" containerID="d1e23f80719b750bf08657c74d755b6f7bfa25e180a6aefdf4588432bde1e885"
Apr 16 20:37:22.736833 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:37:22.736815 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1e23f80719b750bf08657c74d755b6f7bfa25e180a6aefdf4588432bde1e885\": container with ID starting with d1e23f80719b750bf08657c74d755b6f7bfa25e180a6aefdf4588432bde1e885 not found: ID does not exist" containerID="d1e23f80719b750bf08657c74d755b6f7bfa25e180a6aefdf4588432bde1e885"
Apr 16 20:37:22.736906 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:22.736846 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1e23f80719b750bf08657c74d755b6f7bfa25e180a6aefdf4588432bde1e885"} err="failed to get container status \"d1e23f80719b750bf08657c74d755b6f7bfa25e180a6aefdf4588432bde1e885\": rpc error: code = NotFound desc = could not find container \"d1e23f80719b750bf08657c74d755b6f7bfa25e180a6aefdf4588432bde1e885\": container with ID starting with d1e23f80719b750bf08657c74d755b6f7bfa25e180a6aefdf4588432bde1e885 not found: ID does not exist"
Apr 16 20:37:22.736906 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:22.736870 2576 scope.go:117] "RemoveContainer" containerID="9ef5c357c986d35a9164aa423424f8b02b4a81c5a0e1850cb293dc0a5437bb7a"
Apr 16 20:37:22.737140 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:37:22.737119 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ef5c357c986d35a9164aa423424f8b02b4a81c5a0e1850cb293dc0a5437bb7a\": container with ID starting with 9ef5c357c986d35a9164aa423424f8b02b4a81c5a0e1850cb293dc0a5437bb7a not found: ID does not exist" containerID="9ef5c357c986d35a9164aa423424f8b02b4a81c5a0e1850cb293dc0a5437bb7a"
Apr 16 20:37:22.737230 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:22.737146 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ef5c357c986d35a9164aa423424f8b02b4a81c5a0e1850cb293dc0a5437bb7a"} err="failed to get container status \"9ef5c357c986d35a9164aa423424f8b02b4a81c5a0e1850cb293dc0a5437bb7a\": rpc error: code = NotFound desc = could not find container \"9ef5c357c986d35a9164aa423424f8b02b4a81c5a0e1850cb293dc0a5437bb7a\": container with ID starting with 9ef5c357c986d35a9164aa423424f8b02b4a81c5a0e1850cb293dc0a5437bb7a not found: ID does not exist"
Apr 16 20:37:22.737230 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:22.737162 2576 scope.go:117] "RemoveContainer" containerID="246619f077559e9b5c4bdccad9b938f50346369c7bd81875ee1dcc434a9e4f09"
Apr 16 20:37:22.737418 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:37:22.737398 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"246619f077559e9b5c4bdccad9b938f50346369c7bd81875ee1dcc434a9e4f09\": container with ID starting with 246619f077559e9b5c4bdccad9b938f50346369c7bd81875ee1dcc434a9e4f09 not found: ID does not exist" containerID="246619f077559e9b5c4bdccad9b938f50346369c7bd81875ee1dcc434a9e4f09"
Apr 16 20:37:22.737538 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:22.737422 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"246619f077559e9b5c4bdccad9b938f50346369c7bd81875ee1dcc434a9e4f09"} err="failed to get container status \"246619f077559e9b5c4bdccad9b938f50346369c7bd81875ee1dcc434a9e4f09\": rpc error: code = NotFound desc = could not find container \"246619f077559e9b5c4bdccad9b938f50346369c7bd81875ee1dcc434a9e4f09\": container with ID starting with 246619f077559e9b5c4bdccad9b938f50346369c7bd81875ee1dcc434a9e4f09 not found: ID does not exist"
Apr 16 20:37:24.059064 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:24.059038 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="476b0a54-5285-4647-8df2-0b213b567168" path="/var/lib/kubelet/pods/476b0a54-5285-4647-8df2-0b213b567168/volumes"
Apr 16 20:37:43.013344 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.013311 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz"]
Apr 16 20:37:43.013681 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.013580 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e49888ec-b241-4508-a292-23797b64018c" containerName="storage-initializer"
Apr 16 20:37:43.013681 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.013591 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e49888ec-b241-4508-a292-23797b64018c" containerName="storage-initializer"
Apr 16 20:37:43.013681 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.013599 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="476b0a54-5285-4647-8df2-0b213b567168" containerName="storage-initializer"
Apr 16 20:37:43.013681 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.013605 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="476b0a54-5285-4647-8df2-0b213b567168" containerName="storage-initializer"
Apr 16 20:37:43.013681 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.013615 2576 cpu_manager.go:401]
"RemoveStaleState: containerMap: removing container" podUID="476b0a54-5285-4647-8df2-0b213b567168" containerName="main"
Apr 16 20:37:43.013681 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.013619 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="476b0a54-5285-4647-8df2-0b213b567168" containerName="main"
Apr 16 20:37:43.013681 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.013628 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e49888ec-b241-4508-a292-23797b64018c" containerName="tokenizer"
Apr 16 20:37:43.013681 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.013635 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e49888ec-b241-4508-a292-23797b64018c" containerName="tokenizer"
Apr 16 20:37:43.013681 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.013645 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e49888ec-b241-4508-a292-23797b64018c" containerName="main"
Apr 16 20:37:43.013681 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.013650 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e49888ec-b241-4508-a292-23797b64018c" containerName="main"
Apr 16 20:37:43.013681 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.013661 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="476b0a54-5285-4647-8df2-0b213b567168" containerName="tokenizer"
Apr 16 20:37:43.013681 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.013666 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="476b0a54-5285-4647-8df2-0b213b567168" containerName="tokenizer"
Apr 16 20:37:43.014053 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.013708 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="476b0a54-5285-4647-8df2-0b213b567168" containerName="tokenizer"
Apr 16 20:37:43.014053 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.013716 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e49888ec-b241-4508-a292-23797b64018c" containerName="main"
Apr 16 20:37:43.014053 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.013722 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e49888ec-b241-4508-a292-23797b64018c" containerName="tokenizer"
Apr 16 20:37:43.014053 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.013729 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="476b0a54-5285-4647-8df2-0b213b567168" containerName="main"
Apr 16 20:37:43.017818 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.017800 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz"
Apr 16 20:37:43.020658 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.020636 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\""
Apr 16 20:37:43.020750 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.020689 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-s7skk\""
Apr 16 20:37:43.020750 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.020649 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-epp-sa-dockercfg-rwvsc\""
Apr 16 20:37:43.027264 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.027245 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz"]
Apr 16 20:37:43.091721 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.091694 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/10fde807-b49b-4041-aeca-91f040030488-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz\" (UID: \"10fde807-b49b-4041-aeca-91f040030488\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz"
Apr 16 20:37:43.091823 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.091725 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/10fde807-b49b-4041-aeca-91f040030488-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz\" (UID: \"10fde807-b49b-4041-aeca-91f040030488\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz"
Apr 16 20:37:43.091823 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.091754 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/10fde807-b49b-4041-aeca-91f040030488-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz\" (UID: \"10fde807-b49b-4041-aeca-91f040030488\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz"
Apr 16 20:37:43.091903 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.091821 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/10fde807-b49b-4041-aeca-91f040030488-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz\" (UID: \"10fde807-b49b-4041-aeca-91f040030488\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz"
Apr 16 20:37:43.091903 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.091842 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/10fde807-b49b-4041-aeca-91f040030488-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz\" (UID: \"10fde807-b49b-4041-aeca-91f040030488\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz"
Apr 16 20:37:43.091903 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.091859 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2v9r\" (UniqueName: \"kubernetes.io/projected/10fde807-b49b-4041-aeca-91f040030488-kube-api-access-w2v9r\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz\" (UID: \"10fde807-b49b-4041-aeca-91f040030488\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz"
Apr 16 20:37:43.192580 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.192552 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/10fde807-b49b-4041-aeca-91f040030488-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz\" (UID: \"10fde807-b49b-4041-aeca-91f040030488\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz"
Apr 16 20:37:43.192580 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.192579 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2v9r\" (UniqueName: \"kubernetes.io/projected/10fde807-b49b-4041-aeca-91f040030488-kube-api-access-w2v9r\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz\" (UID: \"10fde807-b49b-4041-aeca-91f040030488\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz"
Apr 16 20:37:43.192748 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.192643 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/10fde807-b49b-4041-aeca-91f040030488-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz\" (UID: \"10fde807-b49b-4041-aeca-91f040030488\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz"
Apr 16 20:37:43.192748 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.192673 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/10fde807-b49b-4041-aeca-91f040030488-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz\" (UID: \"10fde807-b49b-4041-aeca-91f040030488\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz"
Apr 16 20:37:43.192852 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.192832 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/10fde807-b49b-4041-aeca-91f040030488-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz\" (UID: \"10fde807-b49b-4041-aeca-91f040030488\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz"
Apr 16 20:37:43.192908 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.192880 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/10fde807-b49b-4041-aeca-91f040030488-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz\" (UID: \"10fde807-b49b-4041-aeca-91f040030488\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz"
Apr 16 20:37:43.193040 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.192963 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName:
\"kubernetes.io/empty-dir/10fde807-b49b-4041-aeca-91f040030488-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz\" (UID: \"10fde807-b49b-4041-aeca-91f040030488\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz" Apr 16 20:37:43.193114 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.193041 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/10fde807-b49b-4041-aeca-91f040030488-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz\" (UID: \"10fde807-b49b-4041-aeca-91f040030488\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz" Apr 16 20:37:43.193214 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.193195 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/10fde807-b49b-4041-aeca-91f040030488-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz\" (UID: \"10fde807-b49b-4041-aeca-91f040030488\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz" Apr 16 20:37:43.193338 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.193320 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/10fde807-b49b-4041-aeca-91f040030488-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz\" (UID: \"10fde807-b49b-4041-aeca-91f040030488\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz" Apr 16 20:37:43.195183 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.195167 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/10fde807-b49b-4041-aeca-91f040030488-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz\" (UID: \"10fde807-b49b-4041-aeca-91f040030488\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz" Apr 16 20:37:43.199993 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.199961 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2v9r\" (UniqueName: \"kubernetes.io/projected/10fde807-b49b-4041-aeca-91f040030488-kube-api-access-w2v9r\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz\" (UID: \"10fde807-b49b-4041-aeca-91f040030488\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz" Apr 16 20:37:43.327202 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.327149 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz" Apr 16 20:37:43.442947 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.442917 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz"] Apr 16 20:37:43.447871 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:37:43.447842 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10fde807_b49b_4041_aeca_91f040030488.slice/crio-05960200c2253cf7b2217f7d1738340d1472232b01e7f26390d2c81820333d5c WatchSource:0}: Error finding container 05960200c2253cf7b2217f7d1738340d1472232b01e7f26390d2c81820333d5c: Status 404 returned error can't find the container with id 05960200c2253cf7b2217f7d1738340d1472232b01e7f26390d2c81820333d5c Apr 16 20:37:43.450024 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.450007 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 
20:37:43.784509 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.784478 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz" event={"ID":"10fde807-b49b-4041-aeca-91f040030488","Type":"ContainerStarted","Data":"db73069b76ab006904690d29d3b22c4d6a074f789c3f08917137b7cd7fda3bba"} Apr 16 20:37:43.784683 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:43.784515 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz" event={"ID":"10fde807-b49b-4041-aeca-91f040030488","Type":"ContainerStarted","Data":"05960200c2253cf7b2217f7d1738340d1472232b01e7f26390d2c81820333d5c"} Apr 16 20:37:44.788588 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:44.788552 2576 generic.go:358] "Generic (PLEG): container finished" podID="10fde807-b49b-4041-aeca-91f040030488" containerID="db73069b76ab006904690d29d3b22c4d6a074f789c3f08917137b7cd7fda3bba" exitCode=0 Apr 16 20:37:44.789020 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:44.788621 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz" event={"ID":"10fde807-b49b-4041-aeca-91f040030488","Type":"ContainerDied","Data":"db73069b76ab006904690d29d3b22c4d6a074f789c3f08917137b7cd7fda3bba"} Apr 16 20:37:45.794332 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:45.794296 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz" event={"ID":"10fde807-b49b-4041-aeca-91f040030488","Type":"ContainerStarted","Data":"46369e9da96f7847ad1fda0e22793f33ee2db96192a6f5f767cfac8f55faa8e2"} Apr 16 20:37:45.794332 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:45.794333 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz" event={"ID":"10fde807-b49b-4041-aeca-91f040030488","Type":"ContainerStarted","Data":"c0325c42ab24b4e040edeb0e4ae1b0d9506fee3c9466396fa01e3800a4be7ed4"} Apr 16 20:37:45.794732 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:45.794435 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz" Apr 16 20:37:45.815195 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:45.815149 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz" podStartSLOduration=3.815136796 podStartE2EDuration="3.815136796s" podCreationTimestamp="2026-04-16 20:37:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:37:45.81352402 +0000 UTC m=+1566.347879745" watchObservedRunningTime="2026-04-16 20:37:45.815136796 +0000 UTC m=+1566.349492521" Apr 16 20:37:53.327802 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:53.327768 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz" Apr 16 20:37:53.327802 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:53.327809 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz" Apr 16 20:37:53.330364 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:53.330340 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz" Apr 16 20:37:53.823790 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:37:53.823768 2576 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz" Apr 16 20:38:14.828044 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:38:14.828017 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz" Apr 16 20:44:18.338223 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:18.338188 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz"] Apr 16 20:44:18.340594 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:18.338594 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz" podUID="10fde807-b49b-4041-aeca-91f040030488" containerName="main" containerID="cri-o://c0325c42ab24b4e040edeb0e4ae1b0d9506fee3c9466396fa01e3800a4be7ed4" gracePeriod=30 Apr 16 20:44:18.340594 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:18.338643 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz" podUID="10fde807-b49b-4041-aeca-91f040030488" containerName="tokenizer" containerID="cri-o://46369e9da96f7847ad1fda0e22793f33ee2db96192a6f5f767cfac8f55faa8e2" gracePeriod=30 Apr 16 20:44:18.967182 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:18.967148 2576 generic.go:358] "Generic (PLEG): container finished" podID="10fde807-b49b-4041-aeca-91f040030488" containerID="c0325c42ab24b4e040edeb0e4ae1b0d9506fee3c9466396fa01e3800a4be7ed4" exitCode=0 Apr 16 20:44:18.967358 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:18.967228 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz" 
event={"ID":"10fde807-b49b-4041-aeca-91f040030488","Type":"ContainerDied","Data":"c0325c42ab24b4e040edeb0e4ae1b0d9506fee3c9466396fa01e3800a4be7ed4"} Apr 16 20:44:19.373502 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:19.373482 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz" Apr 16 20:44:19.493519 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:19.493457 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/10fde807-b49b-4041-aeca-91f040030488-kserve-provision-location\") pod \"10fde807-b49b-4041-aeca-91f040030488\" (UID: \"10fde807-b49b-4041-aeca-91f040030488\") " Apr 16 20:44:19.493519 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:19.493500 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/10fde807-b49b-4041-aeca-91f040030488-tokenizer-uds\") pod \"10fde807-b49b-4041-aeca-91f040030488\" (UID: \"10fde807-b49b-4041-aeca-91f040030488\") " Apr 16 20:44:19.493700 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:19.493531 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2v9r\" (UniqueName: \"kubernetes.io/projected/10fde807-b49b-4041-aeca-91f040030488-kube-api-access-w2v9r\") pod \"10fde807-b49b-4041-aeca-91f040030488\" (UID: \"10fde807-b49b-4041-aeca-91f040030488\") " Apr 16 20:44:19.493700 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:19.493560 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/10fde807-b49b-4041-aeca-91f040030488-tls-certs\") pod \"10fde807-b49b-4041-aeca-91f040030488\" (UID: \"10fde807-b49b-4041-aeca-91f040030488\") " Apr 16 20:44:19.493700 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:19.493575 
2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/10fde807-b49b-4041-aeca-91f040030488-tokenizer-tmp\") pod \"10fde807-b49b-4041-aeca-91f040030488\" (UID: \"10fde807-b49b-4041-aeca-91f040030488\") " Apr 16 20:44:19.493700 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:19.493598 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/10fde807-b49b-4041-aeca-91f040030488-tokenizer-cache\") pod \"10fde807-b49b-4041-aeca-91f040030488\" (UID: \"10fde807-b49b-4041-aeca-91f040030488\") " Apr 16 20:44:19.493910 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:19.493849 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10fde807-b49b-4041-aeca-91f040030488-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "10fde807-b49b-4041-aeca-91f040030488" (UID: "10fde807-b49b-4041-aeca-91f040030488"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:44:19.493964 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:19.493943 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10fde807-b49b-4041-aeca-91f040030488-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "10fde807-b49b-4041-aeca-91f040030488" (UID: "10fde807-b49b-4041-aeca-91f040030488"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:44:19.494027 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:19.493961 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10fde807-b49b-4041-aeca-91f040030488-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "10fde807-b49b-4041-aeca-91f040030488" (UID: "10fde807-b49b-4041-aeca-91f040030488"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:44:19.494252 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:19.494227 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10fde807-b49b-4041-aeca-91f040030488-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "10fde807-b49b-4041-aeca-91f040030488" (UID: "10fde807-b49b-4041-aeca-91f040030488"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:44:19.495543 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:19.495509 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10fde807-b49b-4041-aeca-91f040030488-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "10fde807-b49b-4041-aeca-91f040030488" (UID: "10fde807-b49b-4041-aeca-91f040030488"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:44:19.495631 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:19.495587 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10fde807-b49b-4041-aeca-91f040030488-kube-api-access-w2v9r" (OuterVolumeSpecName: "kube-api-access-w2v9r") pod "10fde807-b49b-4041-aeca-91f040030488" (UID: "10fde807-b49b-4041-aeca-91f040030488"). InnerVolumeSpecName "kube-api-access-w2v9r". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:44:19.594200 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:19.594179 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/10fde807-b49b-4041-aeca-91f040030488-tls-certs\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:44:19.594200 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:19.594198 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/10fde807-b49b-4041-aeca-91f040030488-tokenizer-tmp\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:44:19.594323 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:19.594206 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/10fde807-b49b-4041-aeca-91f040030488-tokenizer-cache\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:44:19.594323 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:19.594215 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/10fde807-b49b-4041-aeca-91f040030488-kserve-provision-location\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:44:19.594323 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:19.594224 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/10fde807-b49b-4041-aeca-91f040030488-tokenizer-uds\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:44:19.594323 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:19.594233 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w2v9r\" (UniqueName: \"kubernetes.io/projected/10fde807-b49b-4041-aeca-91f040030488-kube-api-access-w2v9r\") on node \"ip-10-0-141-145.ec2.internal\" DevicePath \"\"" Apr 16 20:44:19.971492 ip-10-0-141-145 
kubenswrapper[2576]: I0416 20:44:19.971465 2576 generic.go:358] "Generic (PLEG): container finished" podID="10fde807-b49b-4041-aeca-91f040030488" containerID="46369e9da96f7847ad1fda0e22793f33ee2db96192a6f5f767cfac8f55faa8e2" exitCode=0 Apr 16 20:44:19.971621 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:19.971539 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz" Apr 16 20:44:19.971621 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:19.971551 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz" event={"ID":"10fde807-b49b-4041-aeca-91f040030488","Type":"ContainerDied","Data":"46369e9da96f7847ad1fda0e22793f33ee2db96192a6f5f767cfac8f55faa8e2"} Apr 16 20:44:19.971621 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:19.971589 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz" event={"ID":"10fde807-b49b-4041-aeca-91f040030488","Type":"ContainerDied","Data":"05960200c2253cf7b2217f7d1738340d1472232b01e7f26390d2c81820333d5c"} Apr 16 20:44:19.971621 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:19.971605 2576 scope.go:117] "RemoveContainer" containerID="46369e9da96f7847ad1fda0e22793f33ee2db96192a6f5f767cfac8f55faa8e2" Apr 16 20:44:19.980395 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:19.980364 2576 scope.go:117] "RemoveContainer" containerID="c0325c42ab24b4e040edeb0e4ae1b0d9506fee3c9466396fa01e3800a4be7ed4" Apr 16 20:44:19.987091 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:19.987070 2576 scope.go:117] "RemoveContainer" containerID="db73069b76ab006904690d29d3b22c4d6a074f789c3f08917137b7cd7fda3bba" Apr 16 20:44:19.993216 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:19.993194 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz"] Apr 16 20:44:19.994186 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:19.994171 2576 scope.go:117] "RemoveContainer" containerID="46369e9da96f7847ad1fda0e22793f33ee2db96192a6f5f767cfac8f55faa8e2" Apr 16 20:44:19.994481 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:44:19.994447 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46369e9da96f7847ad1fda0e22793f33ee2db96192a6f5f767cfac8f55faa8e2\": container with ID starting with 46369e9da96f7847ad1fda0e22793f33ee2db96192a6f5f767cfac8f55faa8e2 not found: ID does not exist" containerID="46369e9da96f7847ad1fda0e22793f33ee2db96192a6f5f767cfac8f55faa8e2" Apr 16 20:44:19.994556 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:19.994485 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46369e9da96f7847ad1fda0e22793f33ee2db96192a6f5f767cfac8f55faa8e2"} err="failed to get container status \"46369e9da96f7847ad1fda0e22793f33ee2db96192a6f5f767cfac8f55faa8e2\": rpc error: code = NotFound desc = could not find container \"46369e9da96f7847ad1fda0e22793f33ee2db96192a6f5f767cfac8f55faa8e2\": container with ID starting with 46369e9da96f7847ad1fda0e22793f33ee2db96192a6f5f767cfac8f55faa8e2 not found: ID does not exist" Apr 16 20:44:19.994556 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:19.994500 2576 scope.go:117] "RemoveContainer" containerID="c0325c42ab24b4e040edeb0e4ae1b0d9506fee3c9466396fa01e3800a4be7ed4" Apr 16 20:44:19.994706 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:44:19.994688 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0325c42ab24b4e040edeb0e4ae1b0d9506fee3c9466396fa01e3800a4be7ed4\": container with ID starting with c0325c42ab24b4e040edeb0e4ae1b0d9506fee3c9466396fa01e3800a4be7ed4 not found: ID does not exist" 
containerID="c0325c42ab24b4e040edeb0e4ae1b0d9506fee3c9466396fa01e3800a4be7ed4" Apr 16 20:44:19.994744 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:19.994712 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0325c42ab24b4e040edeb0e4ae1b0d9506fee3c9466396fa01e3800a4be7ed4"} err="failed to get container status \"c0325c42ab24b4e040edeb0e4ae1b0d9506fee3c9466396fa01e3800a4be7ed4\": rpc error: code = NotFound desc = could not find container \"c0325c42ab24b4e040edeb0e4ae1b0d9506fee3c9466396fa01e3800a4be7ed4\": container with ID starting with c0325c42ab24b4e040edeb0e4ae1b0d9506fee3c9466396fa01e3800a4be7ed4 not found: ID does not exist" Apr 16 20:44:19.994744 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:19.994730 2576 scope.go:117] "RemoveContainer" containerID="db73069b76ab006904690d29d3b22c4d6a074f789c3f08917137b7cd7fda3bba" Apr 16 20:44:19.994918 ip-10-0-141-145 kubenswrapper[2576]: E0416 20:44:19.994904 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db73069b76ab006904690d29d3b22c4d6a074f789c3f08917137b7cd7fda3bba\": container with ID starting with db73069b76ab006904690d29d3b22c4d6a074f789c3f08917137b7cd7fda3bba not found: ID does not exist" containerID="db73069b76ab006904690d29d3b22c4d6a074f789c3f08917137b7cd7fda3bba" Apr 16 20:44:19.994955 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:19.994922 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db73069b76ab006904690d29d3b22c4d6a074f789c3f08917137b7cd7fda3bba"} err="failed to get container status \"db73069b76ab006904690d29d3b22c4d6a074f789c3f08917137b7cd7fda3bba\": rpc error: code = NotFound desc = could not find container \"db73069b76ab006904690d29d3b22c4d6a074f789c3f08917137b7cd7fda3bba\": container with ID starting with db73069b76ab006904690d29d3b22c4d6a074f789c3f08917137b7cd7fda3bba not found: ID does not exist" Apr 16 
20:44:19.998441 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:19.998422 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b455b6f6r9rtz"] Apr 16 20:44:20.058404 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:20.058382 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10fde807-b49b-4041-aeca-91f040030488" path="/var/lib/kubelet/pods/10fde807-b49b-4041-aeca-91f040030488/volumes" Apr 16 20:44:33.539091 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:33.539048 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-ctvvf_89fab457-9788-4391-ac81-da6b2b19ffdd/istio-proxy/0.log" Apr 16 20:44:34.576419 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:34.576392 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-ctvvf_89fab457-9788-4391-ac81-da6b2b19ffdd/istio-proxy/0.log" Apr 16 20:44:35.607646 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:35.607618 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-ctvvf_89fab457-9788-4391-ac81-da6b2b19ffdd/istio-proxy/0.log" Apr 16 20:44:36.591898 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:36.591861 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-ctvvf_89fab457-9788-4391-ac81-da6b2b19ffdd/istio-proxy/0.log" Apr 16 20:44:37.584930 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:37.584898 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-ctvvf_89fab457-9788-4391-ac81-da6b2b19ffdd/istio-proxy/0.log" Apr 16 20:44:38.577789 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:38.577753 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-ctvvf_89fab457-9788-4391-ac81-da6b2b19ffdd/istio-proxy/0.log" Apr 16 20:44:39.579489 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:39.579462 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-ctvvf_89fab457-9788-4391-ac81-da6b2b19ffdd/istio-proxy/0.log" Apr 16 20:44:40.600671 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:40.600636 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-ctvvf_89fab457-9788-4391-ac81-da6b2b19ffdd/istio-proxy/0.log" Apr 16 20:44:41.700480 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:41.700446 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-ctvvf_89fab457-9788-4391-ac81-da6b2b19ffdd/istio-proxy/0.log" Apr 16 20:44:42.746694 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:42.746663 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-ctvvf_89fab457-9788-4391-ac81-da6b2b19ffdd/istio-proxy/0.log" Apr 16 20:44:43.749315 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:43.749284 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-ctvvf_89fab457-9788-4391-ac81-da6b2b19ffdd/istio-proxy/0.log" Apr 16 20:44:44.785064 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:44.785037 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-ctvvf_89fab457-9788-4391-ac81-da6b2b19ffdd/istio-proxy/0.log" Apr 16 20:44:45.804716 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:45.804689 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-ctvvf_89fab457-9788-4391-ac81-da6b2b19ffdd/istio-proxy/0.log" Apr 16 20:44:46.883154 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:46.883126 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-ctvvf_89fab457-9788-4391-ac81-da6b2b19ffdd/istio-proxy/0.log" Apr 16 20:44:48.022318 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:48.022280 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-qz9fk_d7ec2a16-7abe-4726-8c1e-55762256be38/discovery/0.log" Apr 16 20:44:48.036895 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:48.036876 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-n9qgf_d252744b-82fe-4054-8a24-f06d1b966faa/istio-proxy/0.log" Apr 16 20:44:48.862062 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:48.862036 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-qz9fk_d7ec2a16-7abe-4726-8c1e-55762256be38/discovery/0.log" Apr 16 20:44:48.877157 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:48.877136 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-n9qgf_d252744b-82fe-4054-8a24-f06d1b966faa/istio-proxy/0.log" Apr 16 20:44:49.686046 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:49.686021 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-f7wmw_6fe35cd4-e497-4c2f-99f8-55304c5d108f/manager/0.log" Apr 16 20:44:49.802907 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:49.802878 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-rkbk9_cf1453ee-5268-4494-8172-413ee64649c8/manager/0.log" Apr 16 20:44:55.194606 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:55.194574 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-gj7ch_299edf22-1b19-43c6-aa15-8124389d617a/global-pull-secret-syncer/0.log" Apr 16 20:44:55.261879 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:55.261853 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-7px7k_8346d3a3-b207-4963-8e85-a3cae84eb2ae/konnectivity-agent/0.log" Apr 16 20:44:55.396310 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:55.396280 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-145.ec2.internal_ea24cdd89c298d8269b8d7acdbb62d3a/haproxy/0.log" Apr 16 20:44:59.108954 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:59.108921 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-f7wmw_6fe35cd4-e497-4c2f-99f8-55304c5d108f/manager/0.log" Apr 16 20:44:59.392706 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:44:59.392621 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-rkbk9_cf1453ee-5268-4494-8172-413ee64649c8/manager/0.log" Apr 16 20:45:01.036308 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:01.036274 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zgmbr_5aa991c5-99cc-4458-a0d4-0f785bb5b8cd/node-exporter/0.log" Apr 16 20:45:01.057443 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:01.057416 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zgmbr_5aa991c5-99cc-4458-a0d4-0f785bb5b8cd/kube-rbac-proxy/0.log" Apr 16 20:45:01.079123 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:01.079104 
2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zgmbr_5aa991c5-99cc-4458-a0d4-0f785bb5b8cd/init-textfile/0.log" Apr 16 20:45:03.894040 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:03.894005 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d58bg/perf-node-gather-daemonset-kjlvd"] Apr 16 20:45:03.894423 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:03.894341 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10fde807-b49b-4041-aeca-91f040030488" containerName="main" Apr 16 20:45:03.894423 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:03.894353 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="10fde807-b49b-4041-aeca-91f040030488" containerName="main" Apr 16 20:45:03.894423 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:03.894363 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10fde807-b49b-4041-aeca-91f040030488" containerName="tokenizer" Apr 16 20:45:03.894423 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:03.894369 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="10fde807-b49b-4041-aeca-91f040030488" containerName="tokenizer" Apr 16 20:45:03.894423 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:03.894380 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10fde807-b49b-4041-aeca-91f040030488" containerName="storage-initializer" Apr 16 20:45:03.894423 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:03.894385 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="10fde807-b49b-4041-aeca-91f040030488" containerName="storage-initializer" Apr 16 20:45:03.894606 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:03.894440 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="10fde807-b49b-4041-aeca-91f040030488" containerName="tokenizer" Apr 16 20:45:03.894606 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:03.894449 2576 
memory_manager.go:356] "RemoveStaleState removing state" podUID="10fde807-b49b-4041-aeca-91f040030488" containerName="main" Apr 16 20:45:03.897412 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:03.897391 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-kjlvd" Apr 16 20:45:03.899657 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:03.899632 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-d58bg\"/\"openshift-service-ca.crt\"" Apr 16 20:45:03.900316 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:03.900297 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-d58bg\"/\"kube-root-ca.crt\"" Apr 16 20:45:03.900363 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:03.900297 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-d58bg\"/\"default-dockercfg-fs8m8\"" Apr 16 20:45:03.906741 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:03.906715 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d58bg/perf-node-gather-daemonset-kjlvd"] Apr 16 20:45:04.006138 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:04.006116 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/50217af2-8b3d-45bd-b866-b87f207b94c0-podres\") pod \"perf-node-gather-daemonset-kjlvd\" (UID: \"50217af2-8b3d-45bd-b866-b87f207b94c0\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-kjlvd" Apr 16 20:45:04.006243 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:04.006146 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68hbn\" (UniqueName: \"kubernetes.io/projected/50217af2-8b3d-45bd-b866-b87f207b94c0-kube-api-access-68hbn\") pod 
\"perf-node-gather-daemonset-kjlvd\" (UID: \"50217af2-8b3d-45bd-b866-b87f207b94c0\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-kjlvd" Apr 16 20:45:04.006243 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:04.006167 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/50217af2-8b3d-45bd-b866-b87f207b94c0-proc\") pod \"perf-node-gather-daemonset-kjlvd\" (UID: \"50217af2-8b3d-45bd-b866-b87f207b94c0\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-kjlvd" Apr 16 20:45:04.006243 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:04.006215 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/50217af2-8b3d-45bd-b866-b87f207b94c0-lib-modules\") pod \"perf-node-gather-daemonset-kjlvd\" (UID: \"50217af2-8b3d-45bd-b866-b87f207b94c0\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-kjlvd" Apr 16 20:45:04.006339 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:04.006305 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/50217af2-8b3d-45bd-b866-b87f207b94c0-sys\") pod \"perf-node-gather-daemonset-kjlvd\" (UID: \"50217af2-8b3d-45bd-b866-b87f207b94c0\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-kjlvd" Apr 16 20:45:04.107231 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:04.107198 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/50217af2-8b3d-45bd-b866-b87f207b94c0-sys\") pod \"perf-node-gather-daemonset-kjlvd\" (UID: \"50217af2-8b3d-45bd-b866-b87f207b94c0\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-kjlvd" Apr 16 20:45:04.107231 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:04.107233 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/50217af2-8b3d-45bd-b866-b87f207b94c0-podres\") pod \"perf-node-gather-daemonset-kjlvd\" (UID: \"50217af2-8b3d-45bd-b866-b87f207b94c0\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-kjlvd" Apr 16 20:45:04.107423 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:04.107255 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-68hbn\" (UniqueName: \"kubernetes.io/projected/50217af2-8b3d-45bd-b866-b87f207b94c0-kube-api-access-68hbn\") pod \"perf-node-gather-daemonset-kjlvd\" (UID: \"50217af2-8b3d-45bd-b866-b87f207b94c0\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-kjlvd" Apr 16 20:45:04.107423 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:04.107280 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/50217af2-8b3d-45bd-b866-b87f207b94c0-proc\") pod \"perf-node-gather-daemonset-kjlvd\" (UID: \"50217af2-8b3d-45bd-b866-b87f207b94c0\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-kjlvd" Apr 16 20:45:04.107423 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:04.107313 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/50217af2-8b3d-45bd-b866-b87f207b94c0-lib-modules\") pod \"perf-node-gather-daemonset-kjlvd\" (UID: \"50217af2-8b3d-45bd-b866-b87f207b94c0\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-kjlvd" Apr 16 20:45:04.107423 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:04.107321 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/50217af2-8b3d-45bd-b866-b87f207b94c0-sys\") pod \"perf-node-gather-daemonset-kjlvd\" (UID: \"50217af2-8b3d-45bd-b866-b87f207b94c0\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-kjlvd" 
Apr 16 20:45:04.107423 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:04.107387 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/50217af2-8b3d-45bd-b866-b87f207b94c0-proc\") pod \"perf-node-gather-daemonset-kjlvd\" (UID: \"50217af2-8b3d-45bd-b866-b87f207b94c0\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-kjlvd" Apr 16 20:45:04.107423 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:04.107403 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/50217af2-8b3d-45bd-b866-b87f207b94c0-podres\") pod \"perf-node-gather-daemonset-kjlvd\" (UID: \"50217af2-8b3d-45bd-b866-b87f207b94c0\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-kjlvd" Apr 16 20:45:04.107685 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:04.107470 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/50217af2-8b3d-45bd-b866-b87f207b94c0-lib-modules\") pod \"perf-node-gather-daemonset-kjlvd\" (UID: \"50217af2-8b3d-45bd-b866-b87f207b94c0\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-kjlvd" Apr 16 20:45:04.114894 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:04.114872 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-68hbn\" (UniqueName: \"kubernetes.io/projected/50217af2-8b3d-45bd-b866-b87f207b94c0-kube-api-access-68hbn\") pod \"perf-node-gather-daemonset-kjlvd\" (UID: \"50217af2-8b3d-45bd-b866-b87f207b94c0\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-kjlvd" Apr 16 20:45:04.208128 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:04.208108 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-kjlvd" Apr 16 20:45:04.322836 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:04.322810 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d58bg/perf-node-gather-daemonset-kjlvd"] Apr 16 20:45:04.324946 ip-10-0-141-145 kubenswrapper[2576]: W0416 20:45:04.324919 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod50217af2_8b3d_45bd_b866_b87f207b94c0.slice/crio-214faf93eddc5800378df353607ed7ec3f5826163dd942866898219d17f203da WatchSource:0}: Error finding container 214faf93eddc5800378df353607ed7ec3f5826163dd942866898219d17f203da: Status 404 returned error can't find the container with id 214faf93eddc5800378df353607ed7ec3f5826163dd942866898219d17f203da Apr 16 20:45:04.326718 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:04.326703 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:45:05.134325 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:05.134280 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-kjlvd" event={"ID":"50217af2-8b3d-45bd-b866-b87f207b94c0","Type":"ContainerStarted","Data":"8cf263cd1dfa8fb43a6407444d508c8f01783ddd1667a30baef3c98d88eae00b"} Apr 16 20:45:05.134325 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:05.134327 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-kjlvd" event={"ID":"50217af2-8b3d-45bd-b866-b87f207b94c0","Type":"ContainerStarted","Data":"214faf93eddc5800378df353607ed7ec3f5826163dd942866898219d17f203da"} Apr 16 20:45:05.134654 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:05.134418 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-kjlvd" Apr 16 20:45:05.148711 ip-10-0-141-145 
kubenswrapper[2576]: I0416 20:45:05.148662 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-kjlvd" podStartSLOduration=2.148650072 podStartE2EDuration="2.148650072s" podCreationTimestamp="2026-04-16 20:45:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:45:05.148382391 +0000 UTC m=+2005.682738117" watchObservedRunningTime="2026-04-16 20:45:05.148650072 +0000 UTC m=+2005.683005796" Apr 16 20:45:05.178485 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:05.178463 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-s265g_99f215f0-5679-47ef-9cb6-fee5d63d63ac/dns/0.log" Apr 16 20:45:05.199568 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:05.199551 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-s265g_99f215f0-5679-47ef-9cb6-fee5d63d63ac/kube-rbac-proxy/0.log" Apr 16 20:45:05.245751 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:05.245730 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-pcr4b_cf44a204-89e9-429e-a562-8e270453e2d3/dns-node-resolver/0.log" Apr 16 20:45:05.754813 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:05.754789 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-j59fj_2e2f3589-a853-4295-aedb-5f577abc797b/node-ca/0.log" Apr 16 20:45:06.607592 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:06.607562 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-qz9fk_d7ec2a16-7abe-4726-8c1e-55762256be38/discovery/0.log" Apr 16 20:45:06.632137 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:06.632110 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-n9qgf_d252744b-82fe-4054-8a24-f06d1b966faa/istio-proxy/0.log" Apr 16 20:45:07.123854 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:07.123819 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-t66hl_ae1e5364-bc6b-4206-89ef-2aeef8ddd13c/serve-healthcheck-canary/0.log" Apr 16 20:45:07.597847 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:07.597826 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2k2bj_cb305b65-f801-4d89-b18c-c2f5be7436c5/kube-rbac-proxy/0.log" Apr 16 20:45:07.619623 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:07.619592 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2k2bj_cb305b65-f801-4d89-b18c-c2f5be7436c5/exporter/0.log" Apr 16 20:45:07.641708 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:07.641684 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2k2bj_cb305b65-f801-4d89-b18c-c2f5be7436c5/extractor/0.log" Apr 16 20:45:10.327947 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:10.327916 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-sxq4d_85ee8ad9-c0fa-4b55-8366-030eac073f32/openshift-lws-operator/0.log" Apr 16 20:45:10.851059 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:10.851029 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-66cf78b85b-8bftp_93022683-4178-4a33-a1a4-6a94e6de288a/manager/0.log" Apr 16 20:45:11.131281 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:11.131224 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-9ghn5_99f4da0f-2530-4202-bffc-0147d7fb0a74/manager/0.log" Apr 16 20:45:11.146471 ip-10-0-141-145 kubenswrapper[2576]: 
I0416 20:45:11.146447 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-kjlvd" Apr 16 20:45:17.635503 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:17.635473 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xztj4_74b468ec-61b6-4b33-96b9-598cc8545771/kube-multus-additional-cni-plugins/0.log" Apr 16 20:45:17.658893 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:17.658864 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xztj4_74b468ec-61b6-4b33-96b9-598cc8545771/egress-router-binary-copy/0.log" Apr 16 20:45:17.680598 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:17.680575 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xztj4_74b468ec-61b6-4b33-96b9-598cc8545771/cni-plugins/0.log" Apr 16 20:45:17.701020 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:17.701000 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xztj4_74b468ec-61b6-4b33-96b9-598cc8545771/bond-cni-plugin/0.log" Apr 16 20:45:17.722148 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:17.722131 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xztj4_74b468ec-61b6-4b33-96b9-598cc8545771/routeoverride-cni/0.log" Apr 16 20:45:17.743262 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:17.743248 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xztj4_74b468ec-61b6-4b33-96b9-598cc8545771/whereabouts-cni-bincopy/0.log" Apr 16 20:45:17.765166 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:17.765144 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xztj4_74b468ec-61b6-4b33-96b9-598cc8545771/whereabouts-cni/0.log" Apr 16 20:45:17.795858 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:17.795838 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cmxqh_fd63426d-fc95-432e-ad91-1498d43b0e04/kube-multus/0.log" Apr 16 20:45:17.939210 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:17.939161 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-q46p8_d6bfc3ff-419c-47de-880a-e6eeafc7247a/network-metrics-daemon/0.log" Apr 16 20:45:17.960190 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:17.960158 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-q46p8_d6bfc3ff-419c-47de-880a-e6eeafc7247a/kube-rbac-proxy/0.log" Apr 16 20:45:18.768825 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:18.768800 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4t2rt_13dc8612-6ea4-453f-8d47-52023be79bf8/ovn-controller/0.log" Apr 16 20:45:18.795516 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:18.795495 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4t2rt_13dc8612-6ea4-453f-8d47-52023be79bf8/ovn-acl-logging/0.log" Apr 16 20:45:18.814460 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:18.814441 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4t2rt_13dc8612-6ea4-453f-8d47-52023be79bf8/kube-rbac-proxy-node/0.log" Apr 16 20:45:18.836328 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:18.836289 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4t2rt_13dc8612-6ea4-453f-8d47-52023be79bf8/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 20:45:18.857474 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:18.857454 2576 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4t2rt_13dc8612-6ea4-453f-8d47-52023be79bf8/northd/0.log" Apr 16 20:45:18.878200 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:18.878181 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4t2rt_13dc8612-6ea4-453f-8d47-52023be79bf8/nbdb/0.log" Apr 16 20:45:18.900208 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:18.900192 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4t2rt_13dc8612-6ea4-453f-8d47-52023be79bf8/sbdb/0.log" Apr 16 20:45:18.995778 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:18.995761 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4t2rt_13dc8612-6ea4-453f-8d47-52023be79bf8/ovnkube-controller/0.log" Apr 16 20:45:20.729843 ip-10-0-141-145 kubenswrapper[2576]: I0416 20:45:20.729808 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-54bk5_958053a3-75a0-41f2-8f2b-9790c4c625d8/network-check-target-container/0.log"