Apr 22 18:44:16.041651 ip-10-0-133-84 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 18:44:16.041663 ip-10-0-133-84 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 18:44:16.041670 ip-10-0-133-84 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 18:44:16.041923 ip-10-0-133-84 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 18:44:26.281899 ip-10-0-133-84 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 18:44:26.281920 ip-10-0-133-84 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 129b3e1d55a84bde966989452dfc80cd --
Apr 22 18:46:44.624466 ip-10-0-133-84 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 18:46:45.135204 ip-10-0-133-84 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:46:45.135204 ip-10-0-133-84 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 18:46:45.135204 ip-10-0-133-84 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:46:45.135204 ip-10-0-133-84 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 18:46:45.135204 ip-10-0-133-84 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:46:45.139575 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.139484 2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 18:46:45.142703 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142688 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:46:45.142741 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142704 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:46:45.142741 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142708 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:46:45.142741 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142711 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:46:45.142741 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142713 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:46:45.142741 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142716 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:46:45.142741 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142719 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:46:45.142741 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142722 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:46:45.142741 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142725 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:46:45.142741 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142727 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:46:45.142741 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142730 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:46:45.142741 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142733 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:46:45.142741 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142735 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:46:45.142741 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142738 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:46:45.142741 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142741 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:46:45.142741 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142743 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:46:45.142741 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142746 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:46:45.142741 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142749 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:46:45.142741 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142752 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:46:45.143181 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142755 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:46:45.143181 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142757 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:46:45.143181 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142760 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:46:45.143181 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142762 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:46:45.143181 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142765 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:46:45.143181 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142767 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:46:45.143181 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142776 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:46:45.143181 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142778 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:46:45.143181 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142781 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:46:45.143181 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142783 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:46:45.143181 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142786 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:46:45.143181 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142788 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:46:45.143181 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142791 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:46:45.143181 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142793 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:46:45.143181 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142795 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:46:45.143181 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142798 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:46:45.143181 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142800 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:46:45.143181 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142803 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:46:45.143181 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142805 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:46:45.143181 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142807 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:46:45.143665 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142810 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:46:45.143665 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142812 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:46:45.143665 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142815 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:46:45.143665 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142817 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:46:45.143665 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142820 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:46:45.143665 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142822 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:46:45.143665 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142825 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:46:45.143665 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142827 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:46:45.143665 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142829 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:46:45.143665 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142832 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:46:45.143665 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142835 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:46:45.143665 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142837 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:46:45.143665 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142839 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:46:45.143665 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142844 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:46:45.143665 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142846 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:46:45.143665 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142849 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:46:45.143665 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142854 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:46:45.143665 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142858 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:46:45.143665 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142861 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:46:45.144125 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142864 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:46:45.144125 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142868 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:46:45.144125 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142871 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:46:45.144125 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142873 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:46:45.144125 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142876 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:46:45.144125 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142879 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:46:45.144125 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142882 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:46:45.144125 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142884 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:46:45.144125 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142886 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:46:45.144125 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142889 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:46:45.144125 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142891 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:46:45.144125 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142894 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:46:45.144125 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142896 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:46:45.144125 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142899 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:46:45.144125 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142903 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:46:45.144125 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142906 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:46:45.144125 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142909 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:46:45.144125 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142911 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:46:45.144125 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142914 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:46:45.144601 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142917 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:46:45.144601 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142919 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:46:45.144601 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142922 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:46:45.144601 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142924 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:46:45.144601 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142929 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:46:45.144601 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142931 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:46:45.144601 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142934 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:46:45.144601 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142939 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:46:45.144601 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.142943 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:46:45.144601 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143378 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:46:45.144601 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143385 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:46:45.144601 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143389 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:46:45.144601 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143392 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:46:45.144601 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143395 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:46:45.144601 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143397 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:46:45.144601 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143401 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:46:45.144601 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143403 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:46:45.144601 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143406 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:46:45.144601 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143408 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:46:45.144601 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143411 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:46:45.145087 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143414 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:46:45.145087 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143416 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:46:45.145087 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143419 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:46:45.145087 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143421 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:46:45.145087 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143424 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:46:45.145087 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143427 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:46:45.145087 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143429 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:46:45.145087 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143432 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:46:45.145087 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143435 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:46:45.145087 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143438 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:46:45.145087 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143440 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:46:45.145087 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143443 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:46:45.145087 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143445 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:46:45.145087 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143448 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:46:45.145087 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143450 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:46:45.145087 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143453 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:46:45.145087 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143456 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:46:45.145087 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143459 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:46:45.145087 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143461 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:46:45.145087 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143464 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:46:45.145597 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143467 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:46:45.145597 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143469 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:46:45.145597 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143472 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:46:45.145597 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143474 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:46:45.145597 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143477 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:46:45.145597 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143479 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:46:45.145597 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143482 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:46:45.145597 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143484 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:46:45.145597 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143487 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:46:45.145597 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143489 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:46:45.145597 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143492 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:46:45.145597 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143494 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:46:45.145597 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143497 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:46:45.145597 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143499 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:46:45.145597 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143502 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:46:45.145597 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143504 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:46:45.145597 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143507 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:46:45.145597 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143509 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:46:45.145597 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143511 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:46:45.146065 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143514 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:46:45.146065 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143517 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:46:45.146065 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143520 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:46:45.146065 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143522 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:46:45.146065 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143524 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:46:45.146065 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143527 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:46:45.146065 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143530 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:46:45.146065 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143532 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:46:45.146065 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143535 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:46:45.146065 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143538 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:46:45.146065 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143540 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:46:45.146065 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143543 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:46:45.146065 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143546 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:46:45.146065 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143549 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:46:45.146065 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143551 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:46:45.146065 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143554 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:46:45.146065 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143556 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:46:45.146065 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143559 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:46:45.146065 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143561 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:46:45.146065 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143565 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:46:45.146573 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143568 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:46:45.146573 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143571 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:46:45.146573 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143575 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:46:45.146573 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143578 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:46:45.146573 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143580 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:46:45.146573 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143583 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:46:45.146573 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143586 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:46:45.146573 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143588 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:46:45.146573 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143591 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:46:45.146573 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143593 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:46:45.146573 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143596 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:46:45.146573 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143598 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:46:45.146573 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143601 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:46:45.146573 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143603 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:46:45.146573 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143607 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:46:45.146573 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.143611 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:46:45.146573 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143691 2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 18:46:45.146573 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143698 2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 18:46:45.146573 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143704 2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 18:46:45.146573 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143709 2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 18:46:45.147092 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143717 2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 18:46:45.147092 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143721 2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 18:46:45.147092 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143725 2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 18:46:45.147092 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143733 2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 18:46:45.147092 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143738 2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 18:46:45.147092 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143741 2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 18:46:45.147092 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143745 2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 18:46:45.147092 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143748 2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 18:46:45.147092 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143752 2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 18:46:45.147092 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143754 2573 flags.go:64] FLAG: --cgroup-root=""
Apr 22 18:46:45.147092 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143757 2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 18:46:45.147092 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143760 2573 flags.go:64] FLAG: --client-ca-file=""
Apr 22 18:46:45.147092 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143763 2573 flags.go:64] FLAG: --cloud-config=""
Apr 22 18:46:45.147092 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143766 2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 18:46:45.147092 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143769 2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 18:46:45.147092 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143774 2573 flags.go:64] FLAG: --cluster-domain=""
Apr 22 18:46:45.147092 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143776 2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 18:46:45.147092 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143780 2573 flags.go:64] FLAG: --config-dir=""
Apr 22 18:46:45.147092 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143782 2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 18:46:45.147092 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143786 2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 18:46:45.147092 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143790 2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 18:46:45.147092 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143793 2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 18:46:45.147092 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143796 2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 18:46:45.147092 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143799 2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 18:46:45.147692 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143802 2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 18:46:45.147692 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143804 2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 18:46:45.147692 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143808 2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 18:46:45.147692 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143815 2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 18:46:45.147692 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143819 2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 18:46:45.147692 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143823 2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 18:46:45.147692 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143826 2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 18:46:45.147692 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143829 2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 18:46:45.147692 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143833 2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 18:46:45.147692 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143836 2573 flags.go:64] FLAG: --enable-server="true"
Apr 22 18:46:45.147692 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143839 2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 18:46:45.147692 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143843 2573 flags.go:64] FLAG: --event-burst="100"
Apr 22 18:46:45.147692 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143846 2573 flags.go:64] FLAG: --event-qps="50"
Apr 22 18:46:45.147692 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143849 2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 18:46:45.147692 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143852 2573 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 18:46:45.147692 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143855 2573 flags.go:64] FLAG: --eviction-hard=""
Apr 22 18:46:45.147692 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143858 2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 18:46:45.147692 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143861 2573 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 18:46:45.147692 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143864 2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 18:46:45.147692 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143867 2573 flags.go:64] FLAG: --eviction-soft=""
Apr 22 18:46:45.147692 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143870 2573 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 18:46:45.147692 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143873 2573 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 18:46:45.147692 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143876 2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 18:46:45.147692 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143879 2573 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 18:46:45.147692 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143881 2573 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 18:46:45.148313 ip-10-0-133-84
kubenswrapper[2573]: I0422 18:46:45.143885 2573 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 18:46:45.148313 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143887 2573 flags.go:64] FLAG: --feature-gates="" Apr 22 18:46:45.148313 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143891 2573 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 18:46:45.148313 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143894 2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 18:46:45.148313 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143897 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 18:46:45.148313 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143900 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 18:46:45.148313 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143903 2573 flags.go:64] FLAG: --healthz-port="10248" Apr 22 18:46:45.148313 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143906 2573 flags.go:64] FLAG: --help="false" Apr 22 18:46:45.148313 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143909 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-133-84.ec2.internal" Apr 22 18:46:45.148313 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143911 2573 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 18:46:45.148313 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143915 2573 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 18:46:45.148313 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143921 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 18:46:45.148313 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143925 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 18:46:45.148313 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143928 2573 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 
18:46:45.148313 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143931 2573 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 18:46:45.148313 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143935 2573 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 18:46:45.148313 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143938 2573 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 18:46:45.148313 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143940 2573 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 18:46:45.148313 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143943 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 18:46:45.148313 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143946 2573 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 18:46:45.148313 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143949 2573 flags.go:64] FLAG: --kube-reserved="" Apr 22 18:46:45.148313 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143952 2573 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 18:46:45.148313 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143955 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 18:46:45.148313 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143958 2573 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 18:46:45.148937 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143960 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 18:46:45.148937 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143963 2573 flags.go:64] FLAG: --lock-file="" Apr 22 18:46:45.148937 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143966 2573 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 18:46:45.148937 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143969 2573 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 18:46:45.148937 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143971 2573 flags.go:64] FLAG: 
--log-json-info-buffer-size="0" Apr 22 18:46:45.148937 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143976 2573 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 18:46:45.148937 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143979 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 18:46:45.148937 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143982 2573 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 18:46:45.148937 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143984 2573 flags.go:64] FLAG: --logging-format="text" Apr 22 18:46:45.148937 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143987 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 18:46:45.148937 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143990 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 18:46:45.148937 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143994 2573 flags.go:64] FLAG: --manifest-url="" Apr 22 18:46:45.148937 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.143996 2573 flags.go:64] FLAG: --manifest-url-header="" Apr 22 18:46:45.148937 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144000 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 18:46:45.148937 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144003 2573 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 18:46:45.148937 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144007 2573 flags.go:64] FLAG: --max-pods="110" Apr 22 18:46:45.148937 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144010 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 18:46:45.148937 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144013 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 18:46:45.148937 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144016 2573 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 18:46:45.148937 ip-10-0-133-84 
kubenswrapper[2573]: I0422 18:46:45.144018 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 18:46:45.148937 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144022 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 18:46:45.148937 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144024 2573 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 18:46:45.148937 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144027 2573 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 18:46:45.148937 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144036 2573 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 18:46:45.148937 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144039 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 18:46:45.149539 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144042 2573 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 18:46:45.149539 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144045 2573 flags.go:64] FLAG: --pod-cidr="" Apr 22 18:46:45.149539 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144047 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 18:46:45.149539 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144053 2573 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 18:46:45.149539 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144056 2573 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 18:46:45.149539 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144059 2573 flags.go:64] FLAG: --pods-per-core="0" Apr 22 18:46:45.149539 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144062 2573 flags.go:64] FLAG: --port="10250" Apr 22 18:46:45.149539 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144065 2573 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 18:46:45.149539 
ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144067 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0dc6ded109d9df762" Apr 22 18:46:45.149539 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144070 2573 flags.go:64] FLAG: --qos-reserved="" Apr 22 18:46:45.149539 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144073 2573 flags.go:64] FLAG: --read-only-port="10255" Apr 22 18:46:45.149539 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144076 2573 flags.go:64] FLAG: --register-node="true" Apr 22 18:46:45.149539 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144079 2573 flags.go:64] FLAG: --register-schedulable="true" Apr 22 18:46:45.149539 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144082 2573 flags.go:64] FLAG: --register-with-taints="" Apr 22 18:46:45.149539 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144085 2573 flags.go:64] FLAG: --registry-burst="10" Apr 22 18:46:45.149539 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144088 2573 flags.go:64] FLAG: --registry-qps="5" Apr 22 18:46:45.149539 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144091 2573 flags.go:64] FLAG: --reserved-cpus="" Apr 22 18:46:45.149539 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144093 2573 flags.go:64] FLAG: --reserved-memory="" Apr 22 18:46:45.149539 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144097 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 18:46:45.149539 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144100 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 18:46:45.149539 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144103 2573 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 18:46:45.149539 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144106 2573 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 18:46:45.149539 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144109 2573 flags.go:64] FLAG: --runonce="false" Apr 22 18:46:45.149539 ip-10-0-133-84 
kubenswrapper[2573]: I0422 18:46:45.144112 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 18:46:45.149539 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144114 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 18:46:45.150267 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144117 2573 flags.go:64] FLAG: --seccomp-default="false" Apr 22 18:46:45.150267 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144120 2573 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 18:46:45.150267 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144123 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 18:46:45.150267 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144129 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 18:46:45.150267 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144132 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 18:46:45.150267 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144135 2573 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 18:46:45.150267 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144139 2573 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 18:46:45.150267 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144142 2573 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 18:46:45.150267 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144159 2573 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 18:46:45.150267 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144162 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 18:46:45.150267 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144179 2573 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 18:46:45.150267 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144182 2573 flags.go:64] FLAG: --system-cgroups="" Apr 22 18:46:45.150267 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144185 2573 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 18:46:45.150267 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144191 2573 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 18:46:45.150267 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144193 2573 flags.go:64] FLAG: --tls-cert-file="" Apr 22 18:46:45.150267 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144196 2573 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 18:46:45.150267 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144200 2573 flags.go:64] FLAG: --tls-min-version="" Apr 22 18:46:45.150267 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144203 2573 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 18:46:45.150267 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144206 2573 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 18:46:45.150267 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144209 2573 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 18:46:45.150267 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144212 2573 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 18:46:45.150267 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144215 2573 flags.go:64] FLAG: --v="2" Apr 22 18:46:45.150267 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144219 2573 flags.go:64] FLAG: --version="false" Apr 22 18:46:45.150267 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144223 2573 flags.go:64] FLAG: --vmodule="" Apr 22 18:46:45.150267 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144227 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 18:46:45.150851 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.144231 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 18:46:45.150851 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144331 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:46:45.150851 ip-10-0-133-84 
kubenswrapper[2573]: W0422 18:46:45.144336 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:46:45.150851 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144339 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:46:45.150851 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144341 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:46:45.150851 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144344 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:46:45.150851 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144347 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:46:45.150851 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144349 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:46:45.150851 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144352 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:46:45.150851 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144354 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:46:45.150851 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144357 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:46:45.150851 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144366 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:46:45.150851 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144369 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:46:45.150851 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144372 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:46:45.150851 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144375 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 
18:46:45.150851 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144377 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:46:45.150851 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144380 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:46:45.150851 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144383 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:46:45.150851 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144385 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:46:45.150851 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144388 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:46:45.151686 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144391 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:46:45.151686 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144393 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:46:45.151686 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144396 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:46:45.151686 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144398 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:46:45.151686 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144401 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:46:45.151686 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144403 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:46:45.151686 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144406 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:46:45.151686 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144409 2573 feature_gate.go:328] 
unrecognized feature gate: PinnedImages Apr 22 18:46:45.151686 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144411 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:46:45.151686 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144414 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:46:45.151686 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144416 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:46:45.151686 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144418 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:46:45.151686 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144421 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:46:45.151686 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144424 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:46:45.151686 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144426 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:46:45.151686 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144429 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:46:45.151686 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144432 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:46:45.151686 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144434 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:46:45.151686 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144437 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:46:45.151686 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144439 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:46:45.152576 ip-10-0-133-84 kubenswrapper[2573]: W0422 
18:46:45.144441 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:46:45.152576 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144444 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:46:45.152576 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144446 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:46:45.152576 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144449 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:46:45.152576 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144457 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:46:45.152576 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144461 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:46:45.152576 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144463 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:46:45.152576 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144466 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:46:45.152576 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144468 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:46:45.152576 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144471 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:46:45.152576 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144473 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:46:45.152576 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144475 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:46:45.152576 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144478 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:46:45.152576 ip-10-0-133-84 
kubenswrapper[2573]: W0422 18:46:45.144480 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:46:45.152576 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144483 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:46:45.152576 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144485 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:46:45.152576 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144488 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:46:45.152576 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144490 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:46:45.152576 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144492 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:46:45.152576 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144495 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:46:45.153266 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144497 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:46:45.153266 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144500 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:46:45.153266 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144502 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:46:45.153266 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144504 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:46:45.153266 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144508 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:46:45.153266 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144511 2573 feature_gate.go:328] unrecognized feature gate: 
MultiDiskSetup Apr 22 18:46:45.153266 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144513 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:46:45.153266 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144517 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 18:46:45.153266 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144520 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:46:45.153266 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144523 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:46:45.153266 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144526 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:46:45.153266 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144529 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:46:45.153266 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144531 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:46:45.153266 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144534 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:46:45.153266 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144537 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 18:46:45.153266 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144540 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:46:45.153266 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144543 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:46:45.153266 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144554 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:46:45.153266 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144557 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:46:45.153748 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144559 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:46:45.153748 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144562 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:46:45.153748 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144564 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:46:45.153748 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144566 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:46:45.153748 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144569 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:46:45.153748 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144571 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:46:45.153748 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144574 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:46:45.153748 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.144576 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:46:45.153748 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.145211 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:46:45.154121 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.153816 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 18:46:45.154121 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.153840 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 18:46:45.154121 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.153917 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:46:45.154121 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.153925 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:46:45.154121 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.153930 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:46:45.154121 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.153935 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:46:45.154121 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.153940 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:46:45.154121 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.153944 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:46:45.154121 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.153948 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:46:45.154121 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.153953 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:46:45.154121 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.153957 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:46:45.154121 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.153961 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:46:45.154121 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.153965 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:46:45.154121 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.153969 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:46:45.154121 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.153973 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:46:45.154121 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.153977 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:46:45.154121 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.153980 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:46:45.154121 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.153984 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:46:45.154121 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.153989 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:46:45.154121 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.153993 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:46:45.155119 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.153997 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:46:45.155119 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154001 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:46:45.155119 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154004 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:46:45.155119 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154009 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:46:45.155119 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154012 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:46:45.155119 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154018 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:46:45.155119 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154022 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:46:45.155119 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154026 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:46:45.155119 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154030 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:46:45.155119 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154035 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:46:45.155119 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154039 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:46:45.155119 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154043 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:46:45.155119 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154048 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:46:45.155119 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154055 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:46:45.155119 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154060 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:46:45.155119 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154064 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:46:45.155119 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154069 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:46:45.155119 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154073 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:46:45.155119 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154079 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:46:45.155119 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154083 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:46:45.155738 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154088 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:46:45.155738 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154092 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:46:45.155738 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154096 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:46:45.155738 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154100 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:46:45.155738 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154107 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:46:45.155738 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154114 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:46:45.155738 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154119 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:46:45.155738 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154125 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:46:45.155738 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154131 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:46:45.155738 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154136 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:46:45.155738 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154141 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:46:45.155738 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154145 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:46:45.155738 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154150 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:46:45.155738 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154154 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:46:45.155738 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154158 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:46:45.155738 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154162 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:46:45.155738 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154186 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:46:45.155738 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154190 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:46:45.155738 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154193 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:46:45.156539 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154197 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:46:45.156539 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154201 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:46:45.156539 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154204 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:46:45.156539 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154208 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:46:45.156539 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154212 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:46:45.156539 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154216 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:46:45.156539 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154221 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:46:45.156539 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154225 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:46:45.156539 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154230 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:46:45.156539 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154233 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:46:45.156539 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154237 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:46:45.156539 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154241 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:46:45.156539 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154245 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:46:45.156539 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154248 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:46:45.156539 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154252 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:46:45.156539 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154255 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:46:45.156539 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154259 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:46:45.156539 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154263 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:46:45.156539 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154266 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:46:45.156539 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154270 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:46:45.157295 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154273 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:46:45.157295 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154277 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:46:45.157295 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154281 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:46:45.157295 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154285 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:46:45.157295 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154289 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:46:45.157295 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154294 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:46:45.157295 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154298 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:46:45.157295 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154302 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:46:45.157295 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154307 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:46:45.157295 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.154315 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:46:45.157295 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154482 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:46:45.157295 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154491 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:46:45.157295 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154496 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:46:45.157295 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154500 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:46:45.157295 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154504 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:46:45.157295 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154509 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:46:45.157787 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154513 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:46:45.157787 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154517 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:46:45.157787 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154521 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:46:45.157787 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154526 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:46:45.157787 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154532 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:46:45.157787 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154536 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:46:45.157787 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154540 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:46:45.157787 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154545 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:46:45.157787 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154549 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:46:45.157787 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154553 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:46:45.157787 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154557 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:46:45.157787 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154561 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:46:45.157787 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154566 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:46:45.157787 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154569 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:46:45.157787 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154573 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:46:45.157787 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154578 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:46:45.157787 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154582 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:46:45.157787 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154586 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:46:45.157787 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154590 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:46:45.157787 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154593 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:46:45.158314 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154598 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:46:45.158314 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154602 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:46:45.158314 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154607 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:46:45.158314 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154611 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:46:45.158314 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154615 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:46:45.158314 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154619 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:46:45.158314 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154623 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:46:45.158314 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154627 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:46:45.158314 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154631 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:46:45.158314 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154635 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:46:45.158314 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154639 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:46:45.158314 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154643 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:46:45.158314 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154647 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:46:45.158314 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154652 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:46:45.158314 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154656 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:46:45.158314 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154660 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:46:45.158314 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154664 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:46:45.158314 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154669 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:46:45.158314 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154673 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:46:45.158314 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154677 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:46:45.158837 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154681 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:46:45.158837 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154685 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:46:45.158837 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154689 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:46:45.158837 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154693 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:46:45.158837 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154697 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:46:45.158837 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154701 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:46:45.158837 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154705 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:46:45.158837 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154709 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:46:45.158837 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154713 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:46:45.158837 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154717 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:46:45.158837 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154720 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:46:45.158837 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154724 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:46:45.158837 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154728 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:46:45.158837 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154733 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:46:45.158837 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154737 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:46:45.158837 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154741 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:46:45.158837 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154746 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:46:45.158837 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154750 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:46:45.158837 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154754 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:46:45.158837 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154757 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:46:45.159347 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154761 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:46:45.159347 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154765 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:46:45.159347 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154771 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:46:45.159347 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154776 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:46:45.159347 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154782 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:46:45.159347 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154787 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:46:45.159347 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154791 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:46:45.159347 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154795 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:46:45.159347 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154799 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:46:45.159347 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154803 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:46:45.159347 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154808 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:46:45.159347 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154815 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:46:45.159347 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154820 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:46:45.159347 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154825 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:46:45.159347 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154829 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:46:45.159347 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154833 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:46:45.159347 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154837 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:46:45.159347 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154841 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:46:45.159825 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154845 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:46:45.159825 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:45.154849 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:46:45.159825 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.154857 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:46:45.159825 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.155024 2573 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 18:46:45.159825 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.157617 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 18:46:45.159825 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.158769 2573 server.go:1019] "Starting client certificate rotation"
Apr 22 18:46:45.159825 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.158867 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 18:46:45.159825 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.158903 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 18:46:45.186843 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.186817 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 18:46:45.189924 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.189905 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 18:46:45.210326 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.210305 2573 log.go:25] "Validated CRI v1 runtime API"
Apr 22 18:46:45.216642 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.216624 2573 log.go:25] "Validated CRI v1 image API"
Apr 22 18:46:45.218452 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.218431 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 18:46:45.218793 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.218776 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 18:46:45.221125 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.221097 2573 fs.go:135] Filesystem UUIDs: map[326a165b-369c-4663-a523-c43ae8c217c2:/dev/nvme0n1p3 49f0225b-324e-4114-9911-31c333cb93ac:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 22 18:46:45.221186 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.221126 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 18:46:45.226753 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.226639 2573 manager.go:217] Machine: {Timestamp:2026-04-22 18:46:45.225435976 +0000 UTC m=+0.465552071 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100071 MemoryCapacity:32812163072 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec22001a4485852daae34bb1258e9cc4 SystemUUID:ec22001a-4485-852d-aae3-4bb1258e9cc4 BootID:129b3e1d-55a8-4bde-9669-89452dfc80cd Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406081536 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:0f:b6:d0:80:45 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:0f:b6:d0:80:45 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:56:ce:92:07:4b:95 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812163072 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 18:46:45.226753 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.226749 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 18:46:45.226869 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.226829 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 18:46:45.228792 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.228764 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 18:46:45.228926 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.228795 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-84.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 18:46:45.228969 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.228936 2573 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 18:46:45.228969 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.228944 2573 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 18:46:45.228969 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.228961 2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:46:45.229845 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.229835 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:46:45.231181 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.231158 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:46:45.231302 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.231293 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 18:46:45.233647 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.233638 2573 kubelet.go:491] "Attempting to sync node with API server" Apr 22 18:46:45.233688 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.233652 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 18:46:45.233688 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.233664 2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 18:46:45.233688 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.233673 2573 kubelet.go:397] "Adding apiserver pod source" Apr 22 18:46:45.233688 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.233682 2573 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 22 18:46:45.234886 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.234875 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:46:45.234932 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.234893 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:46:45.235329 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.235313 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-j29kg" Apr 22 18:46:45.237935 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.237922 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 18:46:45.239299 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.239287 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 18:46:45.241337 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.241325 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 18:46:45.241391 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.241342 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 18:46:45.241391 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.241349 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 18:46:45.241648 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.241496 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 18:46:45.242446 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.241760 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 18:46:45.242446 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.241778 2573 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 18:46:45.242446 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.241789 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 18:46:45.242446 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.241817 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 18:46:45.242446 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.241831 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 18:46:45.242446 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.241841 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 18:46:45.242446 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.241854 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 18:46:45.242446 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.241877 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 18:46:45.242446 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.241911 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 18:46:45.242446 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.241920 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 18:46:45.242446 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.242248 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-j29kg" Apr 22 18:46:45.246239 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.246223 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 18:46:45.246313 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.246274 2573 server.go:1295] "Started kubelet" Apr 22 18:46:45.246395 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.246351 2573 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 18:46:45.246451 ip-10-0-133-84 
kubenswrapper[2573]: I0422 18:46:45.246369 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 18:46:45.246451 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.246428 2573 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 18:46:45.247135 ip-10-0-133-84 systemd[1]: Started Kubernetes Kubelet. Apr 22 18:46:45.248342 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.248299 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 18:46:45.248597 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.248584 2573 server.go:317] "Adding debug handlers to kubelet server" Apr 22 18:46:45.249600 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.249583 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:45.251823 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.251801 2573 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-84.ec2.internal" not found Apr 22 18:46:45.251913 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.251839 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:45.253221 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.253159 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 18:46:45.254273 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.254242 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 18:46:45.254978 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.254965 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 18:46:45.255069 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.255007 2573 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 18:46:45.255069 ip-10-0-133-84 
kubenswrapper[2573]: I0422 18:46:45.255061 2573 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 18:46:45.255215 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.255200 2573 reconstruct.go:97] "Volume reconstruction finished" Apr 22 18:46:45.255279 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.255216 2573 reconciler.go:26] "Reconciler: start to sync state" Apr 22 18:46:45.255279 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:45.255202 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-84.ec2.internal\" not found" Apr 22 18:46:45.257271 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.257248 2573 factory.go:153] Registering CRI-O factory Apr 22 18:46:45.257361 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.257276 2573 factory.go:223] Registration of the crio container factory successfully Apr 22 18:46:45.257361 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.257301 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:45.257361 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.257325 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 18:46:45.257361 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.257334 2573 factory.go:55] Registering systemd factory Apr 22 18:46:45.257361 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.257343 2573 factory.go:223] Registration of the systemd container factory successfully Apr 22 18:46:45.257361 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.257364 2573 factory.go:103] Registering Raw factory Apr 22 18:46:45.257603 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.257376 2573 manager.go:1196] Started watching for new ooms in manager Apr 22 18:46:45.257859 
ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.257770 2573 manager.go:319] Starting recovery of all containers Apr 22 18:46:45.258110 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:45.258097 2573 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 18:46:45.263575 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:45.263400 2573 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-133-84.ec2.internal\" not found" node="ip-10-0-133-84.ec2.internal" Apr 22 18:46:45.263686 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.263450 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 18:46:45.266261 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.266233 2573 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-84.ec2.internal" not found Apr 22 18:46:45.269236 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.269052 2573 manager.go:324] Recovery completed Apr 22 18:46:45.273896 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.273881 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:45.275833 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.275815 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-84.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:45.275917 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.275851 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-84.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:45.275917 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.275867 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-84.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:45.276362 ip-10-0-133-84 kubenswrapper[2573]: I0422 
18:46:45.276348 2573 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 18:46:45.276411 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.276362 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 18:46:45.276411 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.276378 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:46:45.278665 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.278653 2573 policy_none.go:49] "None policy: Start" Apr 22 18:46:45.278715 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.278668 2573 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 18:46:45.278715 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.278677 2573 state_mem.go:35] "Initializing new in-memory state store" Apr 22 18:46:45.329132 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.321620 2573 manager.go:341] "Starting Device Plugin manager" Apr 22 18:46:45.329132 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:45.322458 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 18:46:45.329132 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.322480 2573 server.go:85] "Starting device plugin registration server" Apr 22 18:46:45.329132 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.322749 2573 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 18:46:45.329132 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.322765 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 18:46:45.329132 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.322850 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 18:46:45.329132 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.322970 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 18:46:45.329132 ip-10-0-133-84 
kubenswrapper[2573]: I0422 18:46:45.322979 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 18:46:45.329132 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:45.323373 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 18:46:45.329132 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:45.323405 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-84.ec2.internal\" not found" Apr 22 18:46:45.329132 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.325377 2573 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-84.ec2.internal" not found Apr 22 18:46:45.358598 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.358580 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 18:46:45.358697 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.358608 2573 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 18:46:45.358697 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.358623 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 22 18:46:45.358697 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.358630 2573 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 18:46:45.358697 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:45.358658 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 18:46:45.360555 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.360517 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:45.423936 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.423851 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:45.424967 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.424951 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-84.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:45.425045 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.424985 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-84.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:45.425045 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.425000 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-84.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:45.425045 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.425023 2573 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-84.ec2.internal" Apr 22 18:46:45.432638 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.432623 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-84.ec2.internal" Apr 22 18:46:45.432697 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:45.432650 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-84.ec2.internal\": node \"ip-10-0-133-84.ec2.internal\" not found" Apr 22 18:46:45.458751 
ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.458719 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-84.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-84.ec2.internal"] Apr 22 18:46:45.460917 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.460903 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-84.ec2.internal" Apr 22 18:46:45.460997 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.460905 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-84.ec2.internal" Apr 22 18:46:45.483721 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.483700 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-84.ec2.internal" Apr 22 18:46:45.488315 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.488298 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-84.ec2.internal" Apr 22 18:46:45.498308 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.498287 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:46:45.498396 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.498291 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:46:45.556503 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.556477 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/eafa0e7552fac0ae9b69b17030b941b2-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-133-84.ec2.internal\" (UID: \"eafa0e7552fac0ae9b69b17030b941b2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-84.ec2.internal" Apr 22 18:46:45.556623 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.556507 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/eafa0e7552fac0ae9b69b17030b941b2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-84.ec2.internal\" (UID: \"eafa0e7552fac0ae9b69b17030b941b2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-84.ec2.internal" Apr 22 18:46:45.556623 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.556532 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/531ef67c8f8283d29f42b90580fc5209-config\") pod \"kube-apiserver-proxy-ip-10-0-133-84.ec2.internal\" (UID: \"531ef67c8f8283d29f42b90580fc5209\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-84.ec2.internal" Apr 22 18:46:45.657429 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.657404 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/eafa0e7552fac0ae9b69b17030b941b2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-84.ec2.internal\" (UID: \"eafa0e7552fac0ae9b69b17030b941b2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-84.ec2.internal" Apr 22 18:46:45.657429 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.657419 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/eafa0e7552fac0ae9b69b17030b941b2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-84.ec2.internal\" (UID: \"eafa0e7552fac0ae9b69b17030b941b2\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-84.ec2.internal" Apr 22 18:46:45.657571 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.657447 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/eafa0e7552fac0ae9b69b17030b941b2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-84.ec2.internal\" (UID: \"eafa0e7552fac0ae9b69b17030b941b2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-84.ec2.internal" Apr 22 18:46:45.657571 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.657464 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/531ef67c8f8283d29f42b90580fc5209-config\") pod \"kube-apiserver-proxy-ip-10-0-133-84.ec2.internal\" (UID: \"531ef67c8f8283d29f42b90580fc5209\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-84.ec2.internal" Apr 22 18:46:45.657571 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.657488 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/531ef67c8f8283d29f42b90580fc5209-config\") pod \"kube-apiserver-proxy-ip-10-0-133-84.ec2.internal\" (UID: \"531ef67c8f8283d29f42b90580fc5209\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-84.ec2.internal" Apr 22 18:46:45.657571 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.657511 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/eafa0e7552fac0ae9b69b17030b941b2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-84.ec2.internal\" (UID: \"eafa0e7552fac0ae9b69b17030b941b2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-84.ec2.internal" Apr 22 18:46:45.801105 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.801038 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-84.ec2.internal" Apr 22 18:46:45.802141 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:45.802125 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-84.ec2.internal" Apr 22 18:46:46.158363 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.158335 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 18:46:46.159107 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.158479 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 18:46:46.159107 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.158512 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 18:46:46.159107 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.158517 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 18:46:46.233899 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.233878 2573 apiserver.go:52] "Watching apiserver" Apr 22 18:46:46.242281 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.241516 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 18:46:46.242281 ip-10-0-133-84 kubenswrapper[2573]: I0422 
18:46:46.241991 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-133-84.ec2.internal","openshift-cluster-node-tuning-operator/tuned-hgrcf","openshift-dns/node-resolver-sklxm","openshift-image-registry/node-ca-fsm87","openshift-multus/multus-rdcjp","openshift-network-diagnostics/network-check-target-cxglm","openshift-network-operator/iptables-alerter-wp4tk","openshift-ovn-kubernetes/ovnkube-node-cr6wp","kube-system/konnectivity-agent-tkwkf","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhxx6","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-84.ec2.internal","openshift-multus/multus-additional-cni-plugins-9qjxs","openshift-multus/network-metrics-daemon-mjbsn"] Apr 22 18:46:46.244582 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.244559 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.244702 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.244601 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 18:41:45 +0000 UTC" deadline="2027-10-04 16:38:31.421559515 +0000 UTC" Apr 22 18:46:46.244702 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.244666 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12717h51m45.176897137s" Apr 22 18:46:46.245746 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.245726 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-sklxm" Apr 22 18:46:46.247551 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.246928 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-dn7rk\"" Apr 22 18:46:46.247551 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.247198 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:46:46.247551 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.247260 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.247551 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.247324 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 18:46:46.248284 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.248258 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 18:46:46.248700 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.248675 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fsm87" Apr 22 18:46:46.248804 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.248789 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxglm" Apr 22 18:46:46.248902 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:46.248867 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cxglm" podUID="54689db5-4b53-4548-b0fc-1a5da6d4dbcc" Apr 22 18:46:46.249706 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.249678 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-bnk4n\"" Apr 22 18:46:46.250285 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.250248 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 18:46:46.250372 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.250308 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 18:46:46.250372 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.250317 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-pkl5k\"" Apr 22 18:46:46.250372 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.250363 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-wp4tk" Apr 22 18:46:46.250541 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.250268 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 18:46:46.250541 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.250268 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 18:46:46.250629 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.250573 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 18:46:46.251152 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.251131 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-wg8t6\"" Apr 22 18:46:46.251152 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.251145 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 18:46:46.251324 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.251215 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 18:46:46.251547 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.251532 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 18:46:46.251664 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.251649 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.252900 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.252885 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-tkwkf" Apr 22 18:46:46.253284 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.253264 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 18:46:46.253366 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.253296 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:46:46.253519 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.253504 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 18:46:46.253901 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.253885 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-j7zqv\"" Apr 22 18:46:46.254059 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.254046 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 18:46:46.254127 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.254117 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-kbwq8\"" Apr 22 18:46:46.254500 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.254143 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhxx6" Apr 22 18:46:46.254500 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.254491 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 18:46:46.254624 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.254515 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 18:46:46.254724 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.254680 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 18:46:46.254724 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.254709 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 18:46:46.255402 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.255355 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 18:46:46.255541 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.255462 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 18:46:46.255871 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.255852 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-mgd5m\"" Apr 22 18:46:46.255950 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.255874 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 18:46:46.255950 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.255928 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 
18:46:46.256153 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.256134 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9qjxs" Apr 22 18:46:46.256574 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.256551 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 18:46:46.256574 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.256564 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-rqfjz\"" Apr 22 18:46:46.256682 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.256598 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 18:46:46.256941 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.256925 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 18:46:46.257830 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.257811 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjbsn" Apr 22 18:46:46.257918 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:46.257877 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjbsn" podUID="c87517bd-8a13-4cb2-bf88-0b3d8c58b67c" Apr 22 18:46:46.258449 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.258310 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 18:46:46.258449 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.258348 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 18:46:46.258449 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.258422 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-zh5qh\"" Apr 22 18:46:46.261299 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.261283 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-host-run-netns\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.261397 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.261306 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-run-ovn\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.261397 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.261324 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cr6wp\" (UID: 
\"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.261397 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.261342 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wttfn\" (UniqueName: \"kubernetes.io/projected/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-kube-api-access-wttfn\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.261397 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.261365 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/613626ea-ce6e-4907-a058-fed79d83cc79-host\") pod \"node-ca-fsm87\" (UID: \"613626ea-ce6e-4907-a058-fed79d83cc79\") " pod="openshift-image-registry/node-ca-fsm87" Apr 22 18:46:46.261525 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.261403 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/67bcb723-894e-4da7-a40d-b8def8103a95-iptables-alerter-script\") pod \"iptables-alerter-wp4tk\" (UID: \"67bcb723-894e-4da7-a40d-b8def8103a95\") " pod="openshift-network-operator/iptables-alerter-wp4tk" Apr 22 18:46:46.261525 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.261430 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-multus-socket-dir-parent\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.261525 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.261450 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-host-run-ovn-kubernetes\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.261525 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.261471 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6ff01b97-8ec7-4056-9191-72e56fe99653-agent-certs\") pod \"konnectivity-agent-tkwkf\" (UID: \"6ff01b97-8ec7-4056-9191-72e56fe99653\") " pod="kube-system/konnectivity-agent-tkwkf" Apr 22 18:46:46.261525 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.261485 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-tmp\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.261525 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.261499 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88lkc\" (UniqueName: \"kubernetes.io/projected/54689db5-4b53-4548-b0fc-1a5da6d4dbcc-kube-api-access-88lkc\") pod \"network-check-target-cxglm\" (UID: \"54689db5-4b53-4548-b0fc-1a5da6d4dbcc\") " pod="openshift-network-diagnostics/network-check-target-cxglm" Apr 22 18:46:46.261685 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.261529 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-host-slash\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.261685 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.261557 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-ovnkube-config\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.261685 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.261591 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/613626ea-ce6e-4907-a058-fed79d83cc79-serviceca\") pod \"node-ca-fsm87\" (UID: \"613626ea-ce6e-4907-a058-fed79d83cc79\") " pod="openshift-image-registry/node-ca-fsm87" Apr 22 18:46:46.261685 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.261613 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqdx9\" (UniqueName: \"kubernetes.io/projected/67bcb723-894e-4da7-a40d-b8def8103a95-kube-api-access-zqdx9\") pod \"iptables-alerter-wp4tk\" (UID: \"67bcb723-894e-4da7-a40d-b8def8103a95\") " pod="openshift-network-operator/iptables-alerter-wp4tk" Apr 22 18:46:46.261685 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.261630 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-etc-tuned\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.261685 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.261646 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-host-run-netns\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 
18:46:46.261685 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.261659 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-host-var-lib-kubelet\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.261685 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.261673 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-env-overrides\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.261919 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.261707 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-ovnkube-script-lib\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.261919 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.261736 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-etc-modprobe-d\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.261919 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.261771 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c935a1fa-6e87-4c17-b437-53842e8bf6b5-tmp-dir\") pod \"node-resolver-sklxm\" (UID: 
\"c935a1fa-6e87-4c17-b437-53842e8bf6b5\") " pod="openshift-dns/node-resolver-sklxm" Apr 22 18:46:46.261919 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.261807 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6583213e-9d3a-4b3f-b477-300fa1ff26c2-cni-binary-copy\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.261919 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.261838 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-host-run-multus-certs\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.261919 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.261870 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-run-openvswitch\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.261919 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.261895 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d278e3ab-ea06-4b10-bfb7-327499648b8a-system-cni-dir\") pod \"multus-additional-cni-plugins-9qjxs\" (UID: \"d278e3ab-ea06-4b10-bfb7-327499648b8a\") " pod="openshift-multus/multus-additional-cni-plugins-9qjxs" Apr 22 18:46:46.262144 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.261921 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/d278e3ab-ea06-4b10-bfb7-327499648b8a-cni-binary-copy\") pod \"multus-additional-cni-plugins-9qjxs\" (UID: \"d278e3ab-ea06-4b10-bfb7-327499648b8a\") " pod="openshift-multus/multus-additional-cni-plugins-9qjxs" Apr 22 18:46:46.262144 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.261945 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/dcdba7ee-3007-4669-90f0-6d189f580f74-device-dir\") pod \"aws-ebs-csi-driver-node-lhxx6\" (UID: \"dcdba7ee-3007-4669-90f0-6d189f580f74\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhxx6" Apr 22 18:46:46.262144 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.261969 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-lib-modules\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.262144 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.261993 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgq2f\" (UniqueName: \"kubernetes.io/projected/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-kube-api-access-sgq2f\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.262144 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262055 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-host-var-lib-cni-multus\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.262144 ip-10-0-133-84 
kubenswrapper[2573]: I0422 18:46:46.262087 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-host-cni-bin\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.262144 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262110 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-system-cni-dir\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.262144 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262134 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-etc-openvswitch\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.262443 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262160 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d278e3ab-ea06-4b10-bfb7-327499648b8a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9qjxs\" (UID: \"d278e3ab-ea06-4b10-bfb7-327499648b8a\") " pod="openshift-multus/multus-additional-cni-plugins-9qjxs" Apr 22 18:46:46.262443 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262205 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvglp\" (UniqueName: \"kubernetes.io/projected/613626ea-ce6e-4907-a058-fed79d83cc79-kube-api-access-hvglp\") pod \"node-ca-fsm87\" (UID: 
\"613626ea-ce6e-4907-a058-fed79d83cc79\") " pod="openshift-image-registry/node-ca-fsm87" Apr 22 18:46:46.262443 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262227 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-log-socket\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.262443 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262256 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-host-cni-netd\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.262443 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262281 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/dcdba7ee-3007-4669-90f0-6d189f580f74-socket-dir\") pod \"aws-ebs-csi-driver-node-lhxx6\" (UID: \"dcdba7ee-3007-4669-90f0-6d189f580f74\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhxx6" Apr 22 18:46:46.262443 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262297 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-etc-systemd\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.262443 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262326 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-cnibin\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.262443 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262341 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-host-var-lib-cni-bin\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.262443 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262356 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-host-kubelet\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.262443 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262370 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d278e3ab-ea06-4b10-bfb7-327499648b8a-cnibin\") pod \"multus-additional-cni-plugins-9qjxs\" (UID: \"d278e3ab-ea06-4b10-bfb7-327499648b8a\") " pod="openshift-multus/multus-additional-cni-plugins-9qjxs" Apr 22 18:46:46.262443 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262385 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-multus-cni-dir\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.262443 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262402 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-ovn-node-metrics-cert\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.262443 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262416 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-run\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.262443 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262430 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c935a1fa-6e87-4c17-b437-53842e8bf6b5-hosts-file\") pod \"node-resolver-sklxm\" (UID: \"c935a1fa-6e87-4c17-b437-53842e8bf6b5\") " pod="openshift-dns/node-resolver-sklxm" Apr 22 18:46:46.262443 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262445 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6583213e-9d3a-4b3f-b477-300fa1ff26c2-multus-daemon-config\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.263104 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262459 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6ff01b97-8ec7-4056-9191-72e56fe99653-konnectivity-ca\") pod \"konnectivity-agent-tkwkf\" (UID: \"6ff01b97-8ec7-4056-9191-72e56fe99653\") " pod="kube-system/konnectivity-agent-tkwkf" Apr 22 18:46:46.263104 ip-10-0-133-84 
kubenswrapper[2573]: I0422 18:46:46.262474 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p97hz\" (UniqueName: \"kubernetes.io/projected/d278e3ab-ea06-4b10-bfb7-327499648b8a-kube-api-access-p97hz\") pod \"multus-additional-cni-plugins-9qjxs\" (UID: \"d278e3ab-ea06-4b10-bfb7-327499648b8a\") " pod="openshift-multus/multus-additional-cni-plugins-9qjxs" Apr 22 18:46:46.263104 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262497 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/67bcb723-894e-4da7-a40d-b8def8103a95-host-slash\") pod \"iptables-alerter-wp4tk\" (UID: \"67bcb723-894e-4da7-a40d-b8def8103a95\") " pod="openshift-network-operator/iptables-alerter-wp4tk" Apr 22 18:46:46.263104 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262511 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dcdba7ee-3007-4669-90f0-6d189f580f74-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lhxx6\" (UID: \"dcdba7ee-3007-4669-90f0-6d189f580f74\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhxx6" Apr 22 18:46:46.263104 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262525 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/dcdba7ee-3007-4669-90f0-6d189f580f74-registration-dir\") pod \"aws-ebs-csi-driver-node-lhxx6\" (UID: \"dcdba7ee-3007-4669-90f0-6d189f580f74\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhxx6" Apr 22 18:46:46.263104 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262552 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-host\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.263104 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262588 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66j84\" (UniqueName: \"kubernetes.io/projected/6583213e-9d3a-4b3f-b477-300fa1ff26c2-kube-api-access-66j84\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.263104 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262604 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-var-lib-kubelet\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.263104 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262624 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-systemd-units\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.263104 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262651 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-run-systemd\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.263104 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262665 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-node-log\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.263104 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262678 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-etc-sysctl-conf\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.263104 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262692 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-multus-conf-dir\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.263104 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262727 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/dcdba7ee-3007-4669-90f0-6d189f580f74-sys-fs\") pod \"aws-ebs-csi-driver-node-lhxx6\" (UID: \"dcdba7ee-3007-4669-90f0-6d189f580f74\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhxx6" Apr 22 18:46:46.263104 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262761 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nprj\" (UniqueName: \"kubernetes.io/projected/dcdba7ee-3007-4669-90f0-6d189f580f74-kube-api-access-9nprj\") pod \"aws-ebs-csi-driver-node-lhxx6\" (UID: \"dcdba7ee-3007-4669-90f0-6d189f580f74\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhxx6" Apr 22 18:46:46.263104 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262787 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-etc-sysconfig\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.263753 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262811 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-os-release\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.263753 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262835 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-host-run-k8s-cni-cncf-io\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.263753 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262876 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-etc-kubernetes\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.263753 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262901 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-var-lib-openvswitch\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.263753 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262923 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d278e3ab-ea06-4b10-bfb7-327499648b8a-os-release\") pod \"multus-additional-cni-plugins-9qjxs\" (UID: \"d278e3ab-ea06-4b10-bfb7-327499648b8a\") " pod="openshift-multus/multus-additional-cni-plugins-9qjxs" Apr 22 18:46:46.263753 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262946 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d278e3ab-ea06-4b10-bfb7-327499648b8a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9qjxs\" (UID: \"d278e3ab-ea06-4b10-bfb7-327499648b8a\") " pod="openshift-multus/multus-additional-cni-plugins-9qjxs" Apr 22 18:46:46.263753 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262972 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2qcn\" (UniqueName: \"kubernetes.io/projected/c935a1fa-6e87-4c17-b437-53842e8bf6b5-kube-api-access-v2qcn\") pod \"node-resolver-sklxm\" (UID: \"c935a1fa-6e87-4c17-b437-53842e8bf6b5\") " pod="openshift-dns/node-resolver-sklxm" Apr 22 18:46:46.263753 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.262995 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-hostroot\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.263753 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.263022 
2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d278e3ab-ea06-4b10-bfb7-327499648b8a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9qjxs\" (UID: \"d278e3ab-ea06-4b10-bfb7-327499648b8a\") " pod="openshift-multus/multus-additional-cni-plugins-9qjxs" Apr 22 18:46:46.263753 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.263048 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/dcdba7ee-3007-4669-90f0-6d189f580f74-etc-selinux\") pod \"aws-ebs-csi-driver-node-lhxx6\" (UID: \"dcdba7ee-3007-4669-90f0-6d189f580f74\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhxx6" Apr 22 18:46:46.263753 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.263071 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-etc-kubernetes\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.263753 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.263094 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-etc-sysctl-d\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.263753 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.263114 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-sys\") pod \"tuned-hgrcf\" (UID: 
\"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.265575 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.265558 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 18:46:46.287294 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.287274 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-z5dxj" Apr 22 18:46:46.296425 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.296409 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-z5dxj" Apr 22 18:46:46.348299 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:46.348273 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod531ef67c8f8283d29f42b90580fc5209.slice/crio-cd77555ff5d49eac9e9eda1408f599348b2c23b730c4f4fc377bf3f9093c4dce WatchSource:0}: Error finding container cd77555ff5d49eac9e9eda1408f599348b2c23b730c4f4fc377bf3f9093c4dce: Status 404 returned error can't find the container with id cd77555ff5d49eac9e9eda1408f599348b2c23b730c4f4fc377bf3f9093c4dce Apr 22 18:46:46.348565 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:46.348542 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeafa0e7552fac0ae9b69b17030b941b2.slice/crio-9bbf912e3100c3312921652be6edada584904bb42cce0db720f4c3e747fad0d9 WatchSource:0}: Error finding container 9bbf912e3100c3312921652be6edada584904bb42cce0db720f4c3e747fad0d9: Status 404 returned error can't find the container with id 9bbf912e3100c3312921652be6edada584904bb42cce0db720f4c3e747fad0d9 Apr 22 18:46:46.354327 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.354308 2573 provider.go:93] Refreshing cache 
for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:46:46.356188 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.355615 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 18:46:46.362648 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.362607 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-84.ec2.internal" event={"ID":"531ef67c8f8283d29f42b90580fc5209","Type":"ContainerStarted","Data":"cd77555ff5d49eac9e9eda1408f599348b2c23b730c4f4fc377bf3f9093c4dce"} Apr 22 18:46:46.363319 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.363303 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/67bcb723-894e-4da7-a40d-b8def8103a95-host-slash\") pod \"iptables-alerter-wp4tk\" (UID: \"67bcb723-894e-4da7-a40d-b8def8103a95\") " pod="openshift-network-operator/iptables-alerter-wp4tk" Apr 22 18:46:46.363389 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.363328 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dcdba7ee-3007-4669-90f0-6d189f580f74-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lhxx6\" (UID: \"dcdba7ee-3007-4669-90f0-6d189f580f74\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhxx6" Apr 22 18:46:46.363389 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.363349 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/dcdba7ee-3007-4669-90f0-6d189f580f74-registration-dir\") pod \"aws-ebs-csi-driver-node-lhxx6\" (UID: \"dcdba7ee-3007-4669-90f0-6d189f580f74\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhxx6" Apr 22 18:46:46.363389 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.363353 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/67bcb723-894e-4da7-a40d-b8def8103a95-host-slash\") pod \"iptables-alerter-wp4tk\" (UID: \"67bcb723-894e-4da7-a40d-b8def8103a95\") " pod="openshift-network-operator/iptables-alerter-wp4tk" Apr 22 18:46:46.363389 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.363369 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-host\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.363550 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.363397 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-66j84\" (UniqueName: \"kubernetes.io/projected/6583213e-9d3a-4b3f-b477-300fa1ff26c2-kube-api-access-66j84\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.363550 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.363415 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/dcdba7ee-3007-4669-90f0-6d189f580f74-registration-dir\") pod \"aws-ebs-csi-driver-node-lhxx6\" (UID: \"dcdba7ee-3007-4669-90f0-6d189f580f74\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhxx6" Apr 22 18:46:46.363550 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.363425 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dcdba7ee-3007-4669-90f0-6d189f580f74-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lhxx6\" (UID: \"dcdba7ee-3007-4669-90f0-6d189f580f74\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhxx6" Apr 22 18:46:46.363550 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.363451 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-var-lib-kubelet\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.363550 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.363470 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-host\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.363550 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.363484 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-systemd-units\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.363550 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.363506 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-systemd-units\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.363550 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.363529 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-run-systemd\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.363550 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.363542 2573 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-var-lib-kubelet\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.363883 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.363555 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-node-log\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.363883 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.363593 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-run-systemd\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.363883 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.363607 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-node-log\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.363883 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.363613 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-84.ec2.internal" event={"ID":"eafa0e7552fac0ae9b69b17030b941b2","Type":"ContainerStarted","Data":"9bbf912e3100c3312921652be6edada584904bb42cce0db720f4c3e747fad0d9"} Apr 22 18:46:46.363883 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.363670 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-etc-sysctl-conf\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.363883 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.363697 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-multus-conf-dir\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.363883 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.363722 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/dcdba7ee-3007-4669-90f0-6d189f580f74-sys-fs\") pod \"aws-ebs-csi-driver-node-lhxx6\" (UID: \"dcdba7ee-3007-4669-90f0-6d189f580f74\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhxx6" Apr 22 18:46:46.363883 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.363737 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-etc-sysctl-conf\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.363883 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.363746 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9nprj\" (UniqueName: \"kubernetes.io/projected/dcdba7ee-3007-4669-90f0-6d189f580f74-kube-api-access-9nprj\") pod \"aws-ebs-csi-driver-node-lhxx6\" (UID: \"dcdba7ee-3007-4669-90f0-6d189f580f74\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhxx6" Apr 22 18:46:46.363883 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.363769 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-multus-conf-dir\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.363883 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.363773 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/dcdba7ee-3007-4669-90f0-6d189f580f74-sys-fs\") pod \"aws-ebs-csi-driver-node-lhxx6\" (UID: \"dcdba7ee-3007-4669-90f0-6d189f580f74\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhxx6" Apr 22 18:46:46.363883 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.363787 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-etc-sysconfig\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.363883 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.363807 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-os-release\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.363883 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.363821 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-host-run-k8s-cni-cncf-io\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.363883 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.363874 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-os-release\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.363883 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.363881 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-etc-sysconfig\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.364445 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.363914 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-host-run-k8s-cni-cncf-io\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.364445 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.363955 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-etc-kubernetes\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.364445 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.363977 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-var-lib-openvswitch\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.364445 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.363992 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/d278e3ab-ea06-4b10-bfb7-327499648b8a-os-release\") pod \"multus-additional-cni-plugins-9qjxs\" (UID: \"d278e3ab-ea06-4b10-bfb7-327499648b8a\") " pod="openshift-multus/multus-additional-cni-plugins-9qjxs" Apr 22 18:46:46.364445 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.364008 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d278e3ab-ea06-4b10-bfb7-327499648b8a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9qjxs\" (UID: \"d278e3ab-ea06-4b10-bfb7-327499648b8a\") " pod="openshift-multus/multus-additional-cni-plugins-9qjxs" Apr 22 18:46:46.364445 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.364025 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-etc-kubernetes\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.364445 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.364023 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2qcn\" (UniqueName: \"kubernetes.io/projected/c935a1fa-6e87-4c17-b437-53842e8bf6b5-kube-api-access-v2qcn\") pod \"node-resolver-sklxm\" (UID: \"c935a1fa-6e87-4c17-b437-53842e8bf6b5\") " pod="openshift-dns/node-resolver-sklxm" Apr 22 18:46:46.364445 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.364061 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-hostroot\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.364445 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.364085 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-5c4ph\" (UniqueName: \"kubernetes.io/projected/c87517bd-8a13-4cb2-bf88-0b3d8c58b67c-kube-api-access-5c4ph\") pod \"network-metrics-daemon-mjbsn\" (UID: \"c87517bd-8a13-4cb2-bf88-0b3d8c58b67c\") " pod="openshift-multus/network-metrics-daemon-mjbsn" Apr 22 18:46:46.364445 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.364096 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-var-lib-openvswitch\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.364445 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.364119 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d278e3ab-ea06-4b10-bfb7-327499648b8a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9qjxs\" (UID: \"d278e3ab-ea06-4b10-bfb7-327499648b8a\") " pod="openshift-multus/multus-additional-cni-plugins-9qjxs" Apr 22 18:46:46.364445 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.364138 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/dcdba7ee-3007-4669-90f0-6d189f580f74-etc-selinux\") pod \"aws-ebs-csi-driver-node-lhxx6\" (UID: \"dcdba7ee-3007-4669-90f0-6d189f580f74\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhxx6" Apr 22 18:46:46.364445 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.364144 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-hostroot\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.364445 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.364151 
2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d278e3ab-ea06-4b10-bfb7-327499648b8a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9qjxs\" (UID: \"d278e3ab-ea06-4b10-bfb7-327499648b8a\") " pod="openshift-multus/multus-additional-cni-plugins-9qjxs" Apr 22 18:46:46.364445 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.364176 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-etc-kubernetes\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.364445 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.364188 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d278e3ab-ea06-4b10-bfb7-327499648b8a-os-release\") pod \"multus-additional-cni-plugins-9qjxs\" (UID: \"d278e3ab-ea06-4b10-bfb7-327499648b8a\") " pod="openshift-multus/multus-additional-cni-plugins-9qjxs" Apr 22 18:46:46.364445 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.364202 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-etc-sysctl-d\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.365201 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.364222 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-etc-kubernetes\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.365201 ip-10-0-133-84 
kubenswrapper[2573]: I0422 18:46:46.364260 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-sys\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.365201 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.364320 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-host-run-netns\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.365201 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.364351 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-run-ovn\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.365201 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.364358 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-host-run-netns\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.365201 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.364312 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/dcdba7ee-3007-4669-90f0-6d189f580f74-etc-selinux\") pod \"aws-ebs-csi-driver-node-lhxx6\" (UID: \"dcdba7ee-3007-4669-90f0-6d189f580f74\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhxx6" Apr 22 18:46:46.365201 ip-10-0-133-84 kubenswrapper[2573]: 
I0422 18:46:46.364320 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-etc-sysctl-d\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.365201 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.364353 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-sys\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.365201 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.364650 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-run-ovn\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.365201 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.364713 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.365201 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.364745 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wttfn\" (UniqueName: \"kubernetes.io/projected/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-kube-api-access-wttfn\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.365201 ip-10-0-133-84 
kubenswrapper[2573]: I0422 18:46:46.364778 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/613626ea-ce6e-4907-a058-fed79d83cc79-host\") pod \"node-ca-fsm87\" (UID: \"613626ea-ce6e-4907-a058-fed79d83cc79\") " pod="openshift-image-registry/node-ca-fsm87" Apr 22 18:46:46.365201 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.364811 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/67bcb723-894e-4da7-a40d-b8def8103a95-iptables-alerter-script\") pod \"iptables-alerter-wp4tk\" (UID: \"67bcb723-894e-4da7-a40d-b8def8103a95\") " pod="openshift-network-operator/iptables-alerter-wp4tk" Apr 22 18:46:46.365201 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.364785 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.365201 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.364846 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-multus-socket-dir-parent\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.365201 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.364886 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d278e3ab-ea06-4b10-bfb7-327499648b8a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9qjxs\" (UID: 
\"d278e3ab-ea06-4b10-bfb7-327499648b8a\") " pod="openshift-multus/multus-additional-cni-plugins-9qjxs" Apr 22 18:46:46.365201 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.364898 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/613626ea-ce6e-4907-a058-fed79d83cc79-host\") pod \"node-ca-fsm87\" (UID: \"613626ea-ce6e-4907-a058-fed79d83cc79\") " pod="openshift-image-registry/node-ca-fsm87" Apr 22 18:46:46.365960 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.364901 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-host-run-ovn-kubernetes\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.365960 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.364971 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-multus-socket-dir-parent\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.365960 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.365017 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-host-run-ovn-kubernetes\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.365960 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.365072 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6ff01b97-8ec7-4056-9191-72e56fe99653-agent-certs\") pod \"konnectivity-agent-tkwkf\" (UID: 
\"6ff01b97-8ec7-4056-9191-72e56fe99653\") " pod="kube-system/konnectivity-agent-tkwkf" Apr 22 18:46:46.365960 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.365219 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-tmp\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.365960 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.365252 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-88lkc\" (UniqueName: \"kubernetes.io/projected/54689db5-4b53-4548-b0fc-1a5da6d4dbcc-kube-api-access-88lkc\") pod \"network-check-target-cxglm\" (UID: \"54689db5-4b53-4548-b0fc-1a5da6d4dbcc\") " pod="openshift-network-diagnostics/network-check-target-cxglm" Apr 22 18:46:46.365960 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.365284 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-host-slash\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.365960 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.365312 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-ovnkube-config\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.365960 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.365341 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/613626ea-ce6e-4907-a058-fed79d83cc79-serviceca\") pod \"node-ca-fsm87\" (UID: 
\"613626ea-ce6e-4907-a058-fed79d83cc79\") " pod="openshift-image-registry/node-ca-fsm87" Apr 22 18:46:46.365960 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.365370 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zqdx9\" (UniqueName: \"kubernetes.io/projected/67bcb723-894e-4da7-a40d-b8def8103a95-kube-api-access-zqdx9\") pod \"iptables-alerter-wp4tk\" (UID: \"67bcb723-894e-4da7-a40d-b8def8103a95\") " pod="openshift-network-operator/iptables-alerter-wp4tk" Apr 22 18:46:46.365960 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.365396 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-etc-tuned\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.365960 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.365424 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-host-run-netns\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.365960 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.365434 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/67bcb723-894e-4da7-a40d-b8def8103a95-iptables-alerter-script\") pod \"iptables-alerter-wp4tk\" (UID: \"67bcb723-894e-4da7-a40d-b8def8103a95\") " pod="openshift-network-operator/iptables-alerter-wp4tk" Apr 22 18:46:46.365960 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.365683 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-host-var-lib-kubelet\") 
pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.365960 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.365718 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-env-overrides\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.365960 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.365777 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 18:46:46.365960 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.365789 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-host-run-netns\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.365960 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.365808 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-host-var-lib-kubelet\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.366829 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.365877 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-host-slash\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.366829 ip-10-0-133-84 kubenswrapper[2573]: 
I0422 18:46:46.366261 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/613626ea-ce6e-4907-a058-fed79d83cc79-serviceca\") pod \"node-ca-fsm87\" (UID: \"613626ea-ce6e-4907-a058-fed79d83cc79\") " pod="openshift-image-registry/node-ca-fsm87" Apr 22 18:46:46.366829 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.366311 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-ovnkube-config\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.366829 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.366321 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-ovnkube-script-lib\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.366829 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.366363 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-etc-modprobe-d\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.366829 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.366400 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c935a1fa-6e87-4c17-b437-53842e8bf6b5-tmp-dir\") pod \"node-resolver-sklxm\" (UID: \"c935a1fa-6e87-4c17-b437-53842e8bf6b5\") " pod="openshift-dns/node-resolver-sklxm" Apr 22 18:46:46.366829 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.366427 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6583213e-9d3a-4b3f-b477-300fa1ff26c2-cni-binary-copy\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.366829 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.366463 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-host-run-multus-certs\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.366829 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.366512 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-run-openvswitch\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.366829 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.366545 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d278e3ab-ea06-4b10-bfb7-327499648b8a-system-cni-dir\") pod \"multus-additional-cni-plugins-9qjxs\" (UID: \"d278e3ab-ea06-4b10-bfb7-327499648b8a\") " pod="openshift-multus/multus-additional-cni-plugins-9qjxs" Apr 22 18:46:46.366829 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.366619 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d278e3ab-ea06-4b10-bfb7-327499648b8a-cni-binary-copy\") pod \"multus-additional-cni-plugins-9qjxs\" (UID: \"d278e3ab-ea06-4b10-bfb7-327499648b8a\") " pod="openshift-multus/multus-additional-cni-plugins-9qjxs" Apr 22 18:46:46.366829 ip-10-0-133-84 
kubenswrapper[2573]: I0422 18:46:46.366699 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/dcdba7ee-3007-4669-90f0-6d189f580f74-device-dir\") pod \"aws-ebs-csi-driver-node-lhxx6\" (UID: \"dcdba7ee-3007-4669-90f0-6d189f580f74\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhxx6" Apr 22 18:46:46.366829 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.366722 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-lib-modules\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.366829 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.366763 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgq2f\" (UniqueName: \"kubernetes.io/projected/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-kube-api-access-sgq2f\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.366829 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.366791 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-host-var-lib-cni-multus\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.366829 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.366820 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-host-cni-bin\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 
18:46:46.367547 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.366852 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-ovnkube-script-lib\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.367547 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.366918 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-host-cni-bin\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.367547 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.367325 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6583213e-9d3a-4b3f-b477-300fa1ff26c2-cni-binary-copy\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.367547 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.367440 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-etc-modprobe-d\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.367547 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.367445 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-host-var-lib-cni-multus\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.367778 ip-10-0-133-84 
kubenswrapper[2573]: I0422 18:46:46.367575 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-run-openvswitch\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.367778 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.367607 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d278e3ab-ea06-4b10-bfb7-327499648b8a-cni-binary-copy\") pod \"multus-additional-cni-plugins-9qjxs\" (UID: \"d278e3ab-ea06-4b10-bfb7-327499648b8a\") " pod="openshift-multus/multus-additional-cni-plugins-9qjxs" Apr 22 18:46:46.367778 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.367626 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-host-run-multus-certs\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.367778 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.367670 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-system-cni-dir\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.367778 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.367688 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d278e3ab-ea06-4b10-bfb7-327499648b8a-system-cni-dir\") pod \"multus-additional-cni-plugins-9qjxs\" (UID: \"d278e3ab-ea06-4b10-bfb7-327499648b8a\") " pod="openshift-multus/multus-additional-cni-plugins-9qjxs" Apr 22 18:46:46.367778 
ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.367711 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-etc-openvswitch\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.367778 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.367740 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c935a1fa-6e87-4c17-b437-53842e8bf6b5-tmp-dir\") pod \"node-resolver-sklxm\" (UID: \"c935a1fa-6e87-4c17-b437-53842e8bf6b5\") " pod="openshift-dns/node-resolver-sklxm" Apr 22 18:46:46.367778 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.367748 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-etc-openvswitch\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.368145 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.366377 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-env-overrides\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.368145 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.367821 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d278e3ab-ea06-4b10-bfb7-327499648b8a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9qjxs\" (UID: \"d278e3ab-ea06-4b10-bfb7-327499648b8a\") " pod="openshift-multus/multus-additional-cni-plugins-9qjxs" Apr 22 
18:46:46.368145 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.367831 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/dcdba7ee-3007-4669-90f0-6d189f580f74-device-dir\") pod \"aws-ebs-csi-driver-node-lhxx6\" (UID: \"dcdba7ee-3007-4669-90f0-6d189f580f74\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhxx6" Apr 22 18:46:46.368145 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.367882 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hvglp\" (UniqueName: \"kubernetes.io/projected/613626ea-ce6e-4907-a058-fed79d83cc79-kube-api-access-hvglp\") pod \"node-ca-fsm87\" (UID: \"613626ea-ce6e-4907-a058-fed79d83cc79\") " pod="openshift-image-registry/node-ca-fsm87" Apr 22 18:46:46.368145 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.367888 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-system-cni-dir\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.368145 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.367915 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-log-socket\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.368145 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.367967 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-lib-modules\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.368145 
ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.367985 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-host-cni-netd\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.368145 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.368027 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/dcdba7ee-3007-4669-90f0-6d189f580f74-socket-dir\") pod \"aws-ebs-csi-driver-node-lhxx6\" (UID: \"dcdba7ee-3007-4669-90f0-6d189f580f74\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhxx6" Apr 22 18:46:46.368145 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.368046 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-log-socket\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.368145 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.368060 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-etc-systemd\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.368145 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.368093 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-cnibin\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.368145 ip-10-0-133-84 
kubenswrapper[2573]: I0422 18:46:46.368124 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-host-var-lib-cni-bin\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.368748 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.368156 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c87517bd-8a13-4cb2-bf88-0b3d8c58b67c-metrics-certs\") pod \"network-metrics-daemon-mjbsn\" (UID: \"c87517bd-8a13-4cb2-bf88-0b3d8c58b67c\") " pod="openshift-multus/network-metrics-daemon-mjbsn" Apr 22 18:46:46.368748 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.368210 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-host-kubelet\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.368748 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.368260 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d278e3ab-ea06-4b10-bfb7-327499648b8a-cnibin\") pod \"multus-additional-cni-plugins-9qjxs\" (UID: \"d278e3ab-ea06-4b10-bfb7-327499648b8a\") " pod="openshift-multus/multus-additional-cni-plugins-9qjxs" Apr 22 18:46:46.368748 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.368275 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/dcdba7ee-3007-4669-90f0-6d189f580f74-socket-dir\") pod \"aws-ebs-csi-driver-node-lhxx6\" (UID: \"dcdba7ee-3007-4669-90f0-6d189f580f74\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhxx6" Apr 22 18:46:46.368748 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.368289 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-multus-cni-dir\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.368748 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.368337 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-host-cni-netd\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.368748 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.368339 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-ovn-node-metrics-cert\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:46:46.368748 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.368391 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-run\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.368748 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.368427 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c935a1fa-6e87-4c17-b437-53842e8bf6b5-hosts-file\") pod \"node-resolver-sklxm\" (UID: \"c935a1fa-6e87-4c17-b437-53842e8bf6b5\") " 
pod="openshift-dns/node-resolver-sklxm" Apr 22 18:46:46.368748 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.368467 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6583213e-9d3a-4b3f-b477-300fa1ff26c2-multus-daemon-config\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp" Apr 22 18:46:46.368748 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.368536 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6ff01b97-8ec7-4056-9191-72e56fe99653-konnectivity-ca\") pod \"konnectivity-agent-tkwkf\" (UID: \"6ff01b97-8ec7-4056-9191-72e56fe99653\") " pod="kube-system/konnectivity-agent-tkwkf" Apr 22 18:46:46.368748 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.368611 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-run\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" Apr 22 18:46:46.368748 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.368612 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p97hz\" (UniqueName: \"kubernetes.io/projected/d278e3ab-ea06-4b10-bfb7-327499648b8a-kube-api-access-p97hz\") pod \"multus-additional-cni-plugins-9qjxs\" (UID: \"d278e3ab-ea06-4b10-bfb7-327499648b8a\") " pod="openshift-multus/multus-additional-cni-plugins-9qjxs" Apr 22 18:46:46.369488 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.369106 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-multus-cni-dir\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " 
pod="openshift-multus/multus-rdcjp"
Apr 22 18:46:46.369711 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.369679 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6583213e-9d3a-4b3f-b477-300fa1ff26c2-multus-daemon-config\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp"
Apr 22 18:46:46.369872 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.369837 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-host-kubelet\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp"
Apr 22 18:46:46.370135 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.370048 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6ff01b97-8ec7-4056-9191-72e56fe99653-konnectivity-ca\") pod \"konnectivity-agent-tkwkf\" (UID: \"6ff01b97-8ec7-4056-9191-72e56fe99653\") " pod="kube-system/konnectivity-agent-tkwkf"
Apr 22 18:46:46.370135 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.370142 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6ff01b97-8ec7-4056-9191-72e56fe99653-agent-certs\") pod \"konnectivity-agent-tkwkf\" (UID: \"6ff01b97-8ec7-4056-9191-72e56fe99653\") " pod="kube-system/konnectivity-agent-tkwkf"
Apr 22 18:46:46.370327 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.370308 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-etc-systemd\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf"
Apr 22 18:46:46.370327 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.370313 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-etc-tuned\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf"
Apr 22 18:46:46.370427 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.370359 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c935a1fa-6e87-4c17-b437-53842e8bf6b5-hosts-file\") pod \"node-resolver-sklxm\" (UID: \"c935a1fa-6e87-4c17-b437-53842e8bf6b5\") " pod="openshift-dns/node-resolver-sklxm"
Apr 22 18:46:46.370427 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.370383 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-host-var-lib-cni-bin\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp"
Apr 22 18:46:46.370427 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.370420 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d278e3ab-ea06-4b10-bfb7-327499648b8a-cnibin\") pod \"multus-additional-cni-plugins-9qjxs\" (UID: \"d278e3ab-ea06-4b10-bfb7-327499648b8a\") " pod="openshift-multus/multus-additional-cni-plugins-9qjxs"
Apr 22 18:46:46.370561 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.370438 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6583213e-9d3a-4b3f-b477-300fa1ff26c2-cnibin\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp"
Apr 22 18:46:46.371355 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.371337 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d278e3ab-ea06-4b10-bfb7-327499648b8a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9qjxs\" (UID: \"d278e3ab-ea06-4b10-bfb7-327499648b8a\") " pod="openshift-multus/multus-additional-cni-plugins-9qjxs"
Apr 22 18:46:46.373431 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.372705 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-tmp\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf"
Apr 22 18:46:46.373431 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:46.373050 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:46:46.373431 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:46.373078 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:46:46.373431 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:46.373093 2573 projected.go:194] Error preparing data for projected volume kube-api-access-88lkc for pod openshift-network-diagnostics/network-check-target-cxglm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:46.373431 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:46.373238 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/54689db5-4b53-4548-b0fc-1a5da6d4dbcc-kube-api-access-88lkc podName:54689db5-4b53-4548-b0fc-1a5da6d4dbcc nodeName:}" failed. No retries permitted until 2026-04-22 18:46:46.873203896 +0000 UTC m=+2.113319995 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-88lkc" (UniqueName: "kubernetes.io/projected/54689db5-4b53-4548-b0fc-1a5da6d4dbcc-kube-api-access-88lkc") pod "network-check-target-cxglm" (UID: "54689db5-4b53-4548-b0fc-1a5da6d4dbcc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:46.374050 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.374024 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-ovn-node-metrics-cert\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp"
Apr 22 18:46:46.375080 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.375059 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqdx9\" (UniqueName: \"kubernetes.io/projected/67bcb723-894e-4da7-a40d-b8def8103a95-kube-api-access-zqdx9\") pod \"iptables-alerter-wp4tk\" (UID: \"67bcb723-894e-4da7-a40d-b8def8103a95\") " pod="openshift-network-operator/iptables-alerter-wp4tk"
Apr 22 18:46:46.375882 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.375849 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wttfn\" (UniqueName: \"kubernetes.io/projected/bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278-kube-api-access-wttfn\") pod \"ovnkube-node-cr6wp\" (UID: \"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278\") " pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp"
Apr 22 18:46:46.375973 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.375859 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nprj\" (UniqueName: \"kubernetes.io/projected/dcdba7ee-3007-4669-90f0-6d189f580f74-kube-api-access-9nprj\") pod \"aws-ebs-csi-driver-node-lhxx6\" (UID: \"dcdba7ee-3007-4669-90f0-6d189f580f74\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhxx6"
Apr 22 18:46:46.376585 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.376558 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgq2f\" (UniqueName: \"kubernetes.io/projected/fc8f4fb2-59ad-47bf-a8cd-94661b09e98d-kube-api-access-sgq2f\") pod \"tuned-hgrcf\" (UID: \"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d\") " pod="openshift-cluster-node-tuning-operator/tuned-hgrcf"
Apr 22 18:46:46.376705 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.376685 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-66j84\" (UniqueName: \"kubernetes.io/projected/6583213e-9d3a-4b3f-b477-300fa1ff26c2-kube-api-access-66j84\") pod \"multus-rdcjp\" (UID: \"6583213e-9d3a-4b3f-b477-300fa1ff26c2\") " pod="openshift-multus/multus-rdcjp"
Apr 22 18:46:46.376798 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.376779 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2qcn\" (UniqueName: \"kubernetes.io/projected/c935a1fa-6e87-4c17-b437-53842e8bf6b5-kube-api-access-v2qcn\") pod \"node-resolver-sklxm\" (UID: \"c935a1fa-6e87-4c17-b437-53842e8bf6b5\") " pod="openshift-dns/node-resolver-sklxm"
Apr 22 18:46:46.376894 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.376876 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvglp\" (UniqueName: \"kubernetes.io/projected/613626ea-ce6e-4907-a058-fed79d83cc79-kube-api-access-hvglp\") pod \"node-ca-fsm87\" (UID: \"613626ea-ce6e-4907-a058-fed79d83cc79\") " pod="openshift-image-registry/node-ca-fsm87"
Apr 22 18:46:46.376943 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.376921 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p97hz\" (UniqueName: \"kubernetes.io/projected/d278e3ab-ea06-4b10-bfb7-327499648b8a-kube-api-access-p97hz\") pod \"multus-additional-cni-plugins-9qjxs\" (UID: \"d278e3ab-ea06-4b10-bfb7-327499648b8a\") " pod="openshift-multus/multus-additional-cni-plugins-9qjxs"
Apr 22 18:46:46.469708 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.469637 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5c4ph\" (UniqueName: \"kubernetes.io/projected/c87517bd-8a13-4cb2-bf88-0b3d8c58b67c-kube-api-access-5c4ph\") pod \"network-metrics-daemon-mjbsn\" (UID: \"c87517bd-8a13-4cb2-bf88-0b3d8c58b67c\") " pod="openshift-multus/network-metrics-daemon-mjbsn"
Apr 22 18:46:46.469873 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.469855 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c87517bd-8a13-4cb2-bf88-0b3d8c58b67c-metrics-certs\") pod \"network-metrics-daemon-mjbsn\" (UID: \"c87517bd-8a13-4cb2-bf88-0b3d8c58b67c\") " pod="openshift-multus/network-metrics-daemon-mjbsn"
Apr 22 18:46:46.470002 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:46.469987 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:46.470060 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:46.470051 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c87517bd-8a13-4cb2-bf88-0b3d8c58b67c-metrics-certs podName:c87517bd-8a13-4cb2-bf88-0b3d8c58b67c nodeName:}" failed. No retries permitted until 2026-04-22 18:46:46.970036557 +0000 UTC m=+2.210152636 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c87517bd-8a13-4cb2-bf88-0b3d8c58b67c-metrics-certs") pod "network-metrics-daemon-mjbsn" (UID: "c87517bd-8a13-4cb2-bf88-0b3d8c58b67c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:46.478378 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.478356 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c4ph\" (UniqueName: \"kubernetes.io/projected/c87517bd-8a13-4cb2-bf88-0b3d8c58b67c-kube-api-access-5c4ph\") pod \"network-metrics-daemon-mjbsn\" (UID: \"c87517bd-8a13-4cb2-bf88-0b3d8c58b67c\") " pod="openshift-multus/network-metrics-daemon-mjbsn"
Apr 22 18:46:46.567212 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.567191 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hgrcf"
Apr 22 18:46:46.573307 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:46.573280 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc8f4fb2_59ad_47bf_a8cd_94661b09e98d.slice/crio-fa2e0228cec3703852550e5bcf32485212f4983e29d89aa536f669414b20d3cd WatchSource:0}: Error finding container fa2e0228cec3703852550e5bcf32485212f4983e29d89aa536f669414b20d3cd: Status 404 returned error can't find the container with id fa2e0228cec3703852550e5bcf32485212f4983e29d89aa536f669414b20d3cd
Apr 22 18:46:46.590151 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.590107 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-sklxm"
Apr 22 18:46:46.596427 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:46.596408 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc935a1fa_6e87_4c17_b437_53842e8bf6b5.slice/crio-ca233fbf12eb1eca42217237cdde7f754d8c43e86c37a52fb5effa459ebae448 WatchSource:0}: Error finding container ca233fbf12eb1eca42217237cdde7f754d8c43e86c37a52fb5effa459ebae448: Status 404 returned error can't find the container with id ca233fbf12eb1eca42217237cdde7f754d8c43e86c37a52fb5effa459ebae448
Apr 22 18:46:46.605277 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.605248 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rdcjp"
Apr 22 18:46:46.609819 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.609797 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fsm87"
Apr 22 18:46:46.611307 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:46.611283 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6583213e_9d3a_4b3f_b477_300fa1ff26c2.slice/crio-6bc617519dc2617efecb27048bf1445346a7d5b42379898b01bef045b9fbdf19 WatchSource:0}: Error finding container 6bc617519dc2617efecb27048bf1445346a7d5b42379898b01bef045b9fbdf19: Status 404 returned error can't find the container with id 6bc617519dc2617efecb27048bf1445346a7d5b42379898b01bef045b9fbdf19
Apr 22 18:46:46.616548 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:46.616527 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod613626ea_ce6e_4907_a058_fed79d83cc79.slice/crio-33d6fa0351d29a28063229ecee57be00b03edcd61cdd86bc9d5c43eaf1b9fb73 WatchSource:0}: Error finding container 33d6fa0351d29a28063229ecee57be00b03edcd61cdd86bc9d5c43eaf1b9fb73: Status 404 returned error can't find the container with id 33d6fa0351d29a28063229ecee57be00b03edcd61cdd86bc9d5c43eaf1b9fb73
Apr 22 18:46:46.622666 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.622648 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-wp4tk"
Apr 22 18:46:46.628353 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.628335 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp"
Apr 22 18:46:46.630427 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:46.630399 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67bcb723_894e_4da7_a40d_b8def8103a95.slice/crio-10b39b1ff3dafb11c557db8be3eda3cb2d6cd7c161be00570feb1507d66b9870 WatchSource:0}: Error finding container 10b39b1ff3dafb11c557db8be3eda3cb2d6cd7c161be00570feb1507d66b9870: Status 404 returned error can't find the container with id 10b39b1ff3dafb11c557db8be3eda3cb2d6cd7c161be00570feb1507d66b9870
Apr 22 18:46:46.636866 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:46.636848 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbfc7d3e_a6f6_4faa_90ed_6028f3ed1278.slice/crio-29dfb33f31a60bcf8c133e2488e8ebcc41c08648c20e322176310f51f4eef49d WatchSource:0}: Error finding container 29dfb33f31a60bcf8c133e2488e8ebcc41c08648c20e322176310f51f4eef49d: Status 404 returned error can't find the container with id 29dfb33f31a60bcf8c133e2488e8ebcc41c08648c20e322176310f51f4eef49d
Apr 22 18:46:46.654991 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.654973 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-tkwkf"
Apr 22 18:46:46.659656 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.659639 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhxx6"
Apr 22 18:46:46.660507 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:46.660448 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ff01b97_8ec7_4056_9191_72e56fe99653.slice/crio-ba8b6b05d96f314207d1755bb28b2132ef43849d4133c30076f6268d5f737942 WatchSource:0}: Error finding container ba8b6b05d96f314207d1755bb28b2132ef43849d4133c30076f6268d5f737942: Status 404 returned error can't find the container with id ba8b6b05d96f314207d1755bb28b2132ef43849d4133c30076f6268d5f737942
Apr 22 18:46:46.663300 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.663238 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9qjxs"
Apr 22 18:46:46.668663 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:46.668640 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcdba7ee_3007_4669_90f0_6d189f580f74.slice/crio-783b56eed27390865643486eef40a1e2e5c994ec4fe08a1dbc88614bd3994fcc WatchSource:0}: Error finding container 783b56eed27390865643486eef40a1e2e5c994ec4fe08a1dbc88614bd3994fcc: Status 404 returned error can't find the container with id 783b56eed27390865643486eef40a1e2e5c994ec4fe08a1dbc88614bd3994fcc
Apr 22 18:46:46.671965 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:46:46.671931 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd278e3ab_ea06_4b10_bfb7_327499648b8a.slice/crio-7516d13b4f0ecfaa15b08bbd593ad4186052fd98176c0e361724bf519c4769a7 WatchSource:0}: Error finding container 7516d13b4f0ecfaa15b08bbd593ad4186052fd98176c0e361724bf519c4769a7: Status 404 returned error can't find the container with id 7516d13b4f0ecfaa15b08bbd593ad4186052fd98176c0e361724bf519c4769a7
Apr 22 18:46:46.873844 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.873260 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-88lkc\" (UniqueName: \"kubernetes.io/projected/54689db5-4b53-4548-b0fc-1a5da6d4dbcc-kube-api-access-88lkc\") pod \"network-check-target-cxglm\" (UID: \"54689db5-4b53-4548-b0fc-1a5da6d4dbcc\") " pod="openshift-network-diagnostics/network-check-target-cxglm"
Apr 22 18:46:46.873844 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:46.873419 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:46:46.873844 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:46.873439 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:46:46.873844 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:46.873454 2573 projected.go:194] Error preparing data for projected volume kube-api-access-88lkc for pod openshift-network-diagnostics/network-check-target-cxglm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:46.873844 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:46.873507 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/54689db5-4b53-4548-b0fc-1a5da6d4dbcc-kube-api-access-88lkc podName:54689db5-4b53-4548-b0fc-1a5da6d4dbcc nodeName:}" failed. No retries permitted until 2026-04-22 18:46:47.873488902 +0000 UTC m=+3.113604986 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-88lkc" (UniqueName: "kubernetes.io/projected/54689db5-4b53-4548-b0fc-1a5da6d4dbcc-kube-api-access-88lkc") pod "network-check-target-cxglm" (UID: "54689db5-4b53-4548-b0fc-1a5da6d4dbcc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:46.974562 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:46.974528 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c87517bd-8a13-4cb2-bf88-0b3d8c58b67c-metrics-certs\") pod \"network-metrics-daemon-mjbsn\" (UID: \"c87517bd-8a13-4cb2-bf88-0b3d8c58b67c\") " pod="openshift-multus/network-metrics-daemon-mjbsn"
Apr 22 18:46:46.974729 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:46.974666 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:46.974729 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:46.974723 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c87517bd-8a13-4cb2-bf88-0b3d8c58b67c-metrics-certs podName:c87517bd-8a13-4cb2-bf88-0b3d8c58b67c nodeName:}" failed. No retries permitted until 2026-04-22 18:46:47.974704304 +0000 UTC m=+3.214820398 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c87517bd-8a13-4cb2-bf88-0b3d8c58b67c-metrics-certs") pod "network-metrics-daemon-mjbsn" (UID: "c87517bd-8a13-4cb2-bf88-0b3d8c58b67c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:47.055322 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:47.055290 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:46:47.143464 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:47.143210 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:46:47.298064 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:47.297926 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:41:46 +0000 UTC" deadline="2027-09-19 07:24:33.619151857 +0000 UTC"
Apr 22 18:46:47.298064 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:47.297962 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12348h37m46.321194267s"
Apr 22 18:46:47.351249 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:47.350829 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:46:47.370061 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:47.370022 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhxx6" event={"ID":"dcdba7ee-3007-4669-90f0-6d189f580f74","Type":"ContainerStarted","Data":"783b56eed27390865643486eef40a1e2e5c994ec4fe08a1dbc88614bd3994fcc"}
Apr 22 18:46:47.375755 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:47.375723 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" event={"ID":"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278","Type":"ContainerStarted","Data":"29dfb33f31a60bcf8c133e2488e8ebcc41c08648c20e322176310f51f4eef49d"}
Apr 22 18:46:47.389651 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:47.389622 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-wp4tk" event={"ID":"67bcb723-894e-4da7-a40d-b8def8103a95","Type":"ContainerStarted","Data":"10b39b1ff3dafb11c557db8be3eda3cb2d6cd7c161be00570feb1507d66b9870"}
Apr 22 18:46:47.398849 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:47.398752 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fsm87" event={"ID":"613626ea-ce6e-4907-a058-fed79d83cc79","Type":"ContainerStarted","Data":"33d6fa0351d29a28063229ecee57be00b03edcd61cdd86bc9d5c43eaf1b9fb73"}
Apr 22 18:46:47.404382 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:47.404358 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sklxm" event={"ID":"c935a1fa-6e87-4c17-b437-53842e8bf6b5","Type":"ContainerStarted","Data":"ca233fbf12eb1eca42217237cdde7f754d8c43e86c37a52fb5effa459ebae448"}
Apr 22 18:46:47.406820 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:47.406797 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" event={"ID":"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d","Type":"ContainerStarted","Data":"fa2e0228cec3703852550e5bcf32485212f4983e29d89aa536f669414b20d3cd"}
Apr 22 18:46:47.413259 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:47.413228 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-tkwkf" event={"ID":"6ff01b97-8ec7-4056-9191-72e56fe99653","Type":"ContainerStarted","Data":"ba8b6b05d96f314207d1755bb28b2132ef43849d4133c30076f6268d5f737942"}
Apr 22 18:46:47.425178 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:47.425134 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rdcjp" event={"ID":"6583213e-9d3a-4b3f-b477-300fa1ff26c2","Type":"ContainerStarted","Data":"6bc617519dc2617efecb27048bf1445346a7d5b42379898b01bef045b9fbdf19"}
Apr 22 18:46:47.441934 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:47.441903 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9qjxs" event={"ID":"d278e3ab-ea06-4b10-bfb7-327499648b8a","Type":"ContainerStarted","Data":"7516d13b4f0ecfaa15b08bbd593ad4186052fd98176c0e361724bf519c4769a7"}
Apr 22 18:46:47.883638 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:47.883596 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-88lkc\" (UniqueName: \"kubernetes.io/projected/54689db5-4b53-4548-b0fc-1a5da6d4dbcc-kube-api-access-88lkc\") pod \"network-check-target-cxglm\" (UID: \"54689db5-4b53-4548-b0fc-1a5da6d4dbcc\") " pod="openshift-network-diagnostics/network-check-target-cxglm"
Apr 22 18:46:47.883828 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:47.883786 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:46:47.883828 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:47.883805 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:46:47.883828 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:47.883819 2573 projected.go:194] Error preparing data for projected volume kube-api-access-88lkc for pod openshift-network-diagnostics/network-check-target-cxglm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:47.883974 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:47.883877 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/54689db5-4b53-4548-b0fc-1a5da6d4dbcc-kube-api-access-88lkc podName:54689db5-4b53-4548-b0fc-1a5da6d4dbcc nodeName:}" failed. No retries permitted until 2026-04-22 18:46:49.883858236 +0000 UTC m=+5.123974319 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-88lkc" (UniqueName: "kubernetes.io/projected/54689db5-4b53-4548-b0fc-1a5da6d4dbcc-kube-api-access-88lkc") pod "network-check-target-cxglm" (UID: "54689db5-4b53-4548-b0fc-1a5da6d4dbcc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:47.984982 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:47.984942 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c87517bd-8a13-4cb2-bf88-0b3d8c58b67c-metrics-certs\") pod \"network-metrics-daemon-mjbsn\" (UID: \"c87517bd-8a13-4cb2-bf88-0b3d8c58b67c\") " pod="openshift-multus/network-metrics-daemon-mjbsn"
Apr 22 18:46:47.985159 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:47.985097 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:47.985159 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:47.985179 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c87517bd-8a13-4cb2-bf88-0b3d8c58b67c-metrics-certs podName:c87517bd-8a13-4cb2-bf88-0b3d8c58b67c nodeName:}" failed. No retries permitted until 2026-04-22 18:46:49.985144264 +0000 UTC m=+5.225260351 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c87517bd-8a13-4cb2-bf88-0b3d8c58b67c-metrics-certs") pod "network-metrics-daemon-mjbsn" (UID: "c87517bd-8a13-4cb2-bf88-0b3d8c58b67c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:48.299192 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:48.299091 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:41:46 +0000 UTC" deadline="2027-11-13 07:50:52.500189042 +0000 UTC"
Apr 22 18:46:48.299192 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:48.299134 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13669h4m4.201061279s"
Apr 22 18:46:48.359133 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:48.359102 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjbsn"
Apr 22 18:46:48.359328 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:48.359258 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mjbsn" podUID="c87517bd-8a13-4cb2-bf88-0b3d8c58b67c"
Apr 22 18:46:48.359664 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:48.359645 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxglm"
Apr 22 18:46:48.359756 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:48.359735 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxglm" podUID="54689db5-4b53-4548-b0fc-1a5da6d4dbcc"
Apr 22 18:46:49.899637 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:49.899588 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-88lkc\" (UniqueName: \"kubernetes.io/projected/54689db5-4b53-4548-b0fc-1a5da6d4dbcc-kube-api-access-88lkc\") pod \"network-check-target-cxglm\" (UID: \"54689db5-4b53-4548-b0fc-1a5da6d4dbcc\") " pod="openshift-network-diagnostics/network-check-target-cxglm"
Apr 22 18:46:49.900023 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:49.899783 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:46:49.900023 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:49.899801 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:46:49.900023 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:49.899814 2573 projected.go:194] Error preparing data for projected volume kube-api-access-88lkc for pod openshift-network-diagnostics/network-check-target-cxglm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:49.900023 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:49.899861 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/54689db5-4b53-4548-b0fc-1a5da6d4dbcc-kube-api-access-88lkc podName:54689db5-4b53-4548-b0fc-1a5da6d4dbcc nodeName:}" failed. No retries permitted until 2026-04-22 18:46:53.899847817 +0000 UTC m=+9.139963910 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-88lkc" (UniqueName: "kubernetes.io/projected/54689db5-4b53-4548-b0fc-1a5da6d4dbcc-kube-api-access-88lkc") pod "network-check-target-cxglm" (UID: "54689db5-4b53-4548-b0fc-1a5da6d4dbcc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:50.000863 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:50.000797 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c87517bd-8a13-4cb2-bf88-0b3d8c58b67c-metrics-certs\") pod \"network-metrics-daemon-mjbsn\" (UID: \"c87517bd-8a13-4cb2-bf88-0b3d8c58b67c\") " pod="openshift-multus/network-metrics-daemon-mjbsn"
Apr 22 18:46:50.001017 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:50.000966 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:50.001089 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:50.001030 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c87517bd-8a13-4cb2-bf88-0b3d8c58b67c-metrics-certs podName:c87517bd-8a13-4cb2-bf88-0b3d8c58b67c nodeName:}" failed. No retries permitted until 2026-04-22 18:46:54.001009955 +0000 UTC m=+9.241126071 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c87517bd-8a13-4cb2-bf88-0b3d8c58b67c-metrics-certs") pod "network-metrics-daemon-mjbsn" (UID: "c87517bd-8a13-4cb2-bf88-0b3d8c58b67c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:50.359363 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:50.359281 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjbsn"
Apr 22 18:46:50.359363 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:50.359300 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxglm"
Apr 22 18:46:50.359540 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:50.359416 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mjbsn" podUID="c87517bd-8a13-4cb2-bf88-0b3d8c58b67c"
Apr 22 18:46:50.359593 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:50.359560 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxglm" podUID="54689db5-4b53-4548-b0fc-1a5da6d4dbcc"
Apr 22 18:46:52.359808 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:52.359775 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjbsn"
Apr 22 18:46:52.360266 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:52.359917 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mjbsn" podUID="c87517bd-8a13-4cb2-bf88-0b3d8c58b67c"
Apr 22 18:46:52.360363 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:52.360342 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxglm"
Apr 22 18:46:52.360465 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:52.360438 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxglm" podUID="54689db5-4b53-4548-b0fc-1a5da6d4dbcc"
Apr 22 18:46:53.935127 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:53.935085 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-88lkc\" (UniqueName: \"kubernetes.io/projected/54689db5-4b53-4548-b0fc-1a5da6d4dbcc-kube-api-access-88lkc\") pod \"network-check-target-cxglm\" (UID: \"54689db5-4b53-4548-b0fc-1a5da6d4dbcc\") " pod="openshift-network-diagnostics/network-check-target-cxglm"
Apr 22 18:46:53.935574 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:53.935286 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:46:53.935574 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:53.935309 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:46:53.935574 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:53.935323 2573 projected.go:194] Error preparing data for projected volume kube-api-access-88lkc for pod openshift-network-diagnostics/network-check-target-cxglm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:53.935574 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:53.935395 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/54689db5-4b53-4548-b0fc-1a5da6d4dbcc-kube-api-access-88lkc podName:54689db5-4b53-4548-b0fc-1a5da6d4dbcc nodeName:}" failed. No retries permitted until 2026-04-22 18:47:01.935374803 +0000 UTC m=+17.175490906 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "kube-api-access-88lkc" (UniqueName: "kubernetes.io/projected/54689db5-4b53-4548-b0fc-1a5da6d4dbcc-kube-api-access-88lkc") pod "network-check-target-cxglm" (UID: "54689db5-4b53-4548-b0fc-1a5da6d4dbcc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:54.036382 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:54.036340 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c87517bd-8a13-4cb2-bf88-0b3d8c58b67c-metrics-certs\") pod \"network-metrics-daemon-mjbsn\" (UID: \"c87517bd-8a13-4cb2-bf88-0b3d8c58b67c\") " pod="openshift-multus/network-metrics-daemon-mjbsn"
Apr 22 18:46:54.036541 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:54.036473 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:54.036613 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:54.036547 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c87517bd-8a13-4cb2-bf88-0b3d8c58b67c-metrics-certs podName:c87517bd-8a13-4cb2-bf88-0b3d8c58b67c nodeName:}" failed. No retries permitted until 2026-04-22 18:47:02.036528695 +0000 UTC m=+17.276644779 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c87517bd-8a13-4cb2-bf88-0b3d8c58b67c-metrics-certs") pod "network-metrics-daemon-mjbsn" (UID: "c87517bd-8a13-4cb2-bf88-0b3d8c58b67c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:54.316942 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:54.316155 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-d4kbs"]
Apr 22 18:46:54.320655 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:54.320360 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d4kbs"
Apr 22 18:46:54.320655 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:54.320443 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d4kbs" podUID="0c9de063-c082-4665-b7f0-97a86598f6a1"
Apr 22 18:46:54.359159 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:54.359123 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxglm"
Apr 22 18:46:54.359339 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:54.359257 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxglm" podUID="54689db5-4b53-4548-b0fc-1a5da6d4dbcc"
Apr 22 18:46:54.359601 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:54.359583 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjbsn"
Apr 22 18:46:54.359706 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:54.359687 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mjbsn" podUID="c87517bd-8a13-4cb2-bf88-0b3d8c58b67c"
Apr 22 18:46:54.439613 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:54.439570 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0c9de063-c082-4665-b7f0-97a86598f6a1-dbus\") pod \"global-pull-secret-syncer-d4kbs\" (UID: \"0c9de063-c082-4665-b7f0-97a86598f6a1\") " pod="kube-system/global-pull-secret-syncer-d4kbs"
Apr 22 18:46:54.439781 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:54.439639 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0c9de063-c082-4665-b7f0-97a86598f6a1-kubelet-config\") pod \"global-pull-secret-syncer-d4kbs\" (UID: \"0c9de063-c082-4665-b7f0-97a86598f6a1\") " pod="kube-system/global-pull-secret-syncer-d4kbs"
Apr 22 18:46:54.439781 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:54.439697 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0c9de063-c082-4665-b7f0-97a86598f6a1-original-pull-secret\") pod \"global-pull-secret-syncer-d4kbs\" (UID: \"0c9de063-c082-4665-b7f0-97a86598f6a1\") " pod="kube-system/global-pull-secret-syncer-d4kbs"
Apr 22 18:46:54.540628 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:54.540591 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0c9de063-c082-4665-b7f0-97a86598f6a1-kubelet-config\") pod \"global-pull-secret-syncer-d4kbs\" (UID: \"0c9de063-c082-4665-b7f0-97a86598f6a1\") " pod="kube-system/global-pull-secret-syncer-d4kbs"
Apr 22 18:46:54.540767 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:54.540653 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0c9de063-c082-4665-b7f0-97a86598f6a1-original-pull-secret\") pod \"global-pull-secret-syncer-d4kbs\" (UID: \"0c9de063-c082-4665-b7f0-97a86598f6a1\") " pod="kube-system/global-pull-secret-syncer-d4kbs"
Apr 22 18:46:54.540767 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:54.540690 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0c9de063-c082-4665-b7f0-97a86598f6a1-dbus\") pod \"global-pull-secret-syncer-d4kbs\" (UID: \"0c9de063-c082-4665-b7f0-97a86598f6a1\") " pod="kube-system/global-pull-secret-syncer-d4kbs"
Apr 22 18:46:54.540891 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:54.540835 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0c9de063-c082-4665-b7f0-97a86598f6a1-dbus\") pod \"global-pull-secret-syncer-d4kbs\" (UID: \"0c9de063-c082-4665-b7f0-97a86598f6a1\") " pod="kube-system/global-pull-secret-syncer-d4kbs"
Apr 22 18:46:54.540891 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:54.540884 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0c9de063-c082-4665-b7f0-97a86598f6a1-kubelet-config\") pod
\"global-pull-secret-syncer-d4kbs\" (UID: \"0c9de063-c082-4665-b7f0-97a86598f6a1\") " pod="kube-system/global-pull-secret-syncer-d4kbs"
Apr 22 18:46:54.540969 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:54.540955 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:54.541009 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:54.540999 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c9de063-c082-4665-b7f0-97a86598f6a1-original-pull-secret podName:0c9de063-c082-4665-b7f0-97a86598f6a1 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:55.040986993 +0000 UTC m=+10.281103074 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0c9de063-c082-4665-b7f0-97a86598f6a1-original-pull-secret") pod "global-pull-secret-syncer-d4kbs" (UID: "0c9de063-c082-4665-b7f0-97a86598f6a1") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:55.047235 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:55.047140 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0c9de063-c082-4665-b7f0-97a86598f6a1-original-pull-secret\") pod \"global-pull-secret-syncer-d4kbs\" (UID: \"0c9de063-c082-4665-b7f0-97a86598f6a1\") " pod="kube-system/global-pull-secret-syncer-d4kbs"
Apr 22 18:46:55.047614 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:55.047305 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:55.047614 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:55.047363 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c9de063-c082-4665-b7f0-97a86598f6a1-original-pull-secret podName:0c9de063-c082-4665-b7f0-97a86598f6a1 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:56.047345176 +0000 UTC m=+11.287461271 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0c9de063-c082-4665-b7f0-97a86598f6a1-original-pull-secret") pod "global-pull-secret-syncer-d4kbs" (UID: "0c9de063-c082-4665-b7f0-97a86598f6a1") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:56.055226 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:56.055197 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0c9de063-c082-4665-b7f0-97a86598f6a1-original-pull-secret\") pod \"global-pull-secret-syncer-d4kbs\" (UID: \"0c9de063-c082-4665-b7f0-97a86598f6a1\") " pod="kube-system/global-pull-secret-syncer-d4kbs"
Apr 22 18:46:56.055677 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:56.055353 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:56.055677 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:56.055420 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c9de063-c082-4665-b7f0-97a86598f6a1-original-pull-secret podName:0c9de063-c082-4665-b7f0-97a86598f6a1 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:58.055401293 +0000 UTC m=+13.295517373 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0c9de063-c082-4665-b7f0-97a86598f6a1-original-pull-secret") pod "global-pull-secret-syncer-d4kbs" (UID: "0c9de063-c082-4665-b7f0-97a86598f6a1") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:56.359241 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:56.359148 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d4kbs"
Apr 22 18:46:56.359241 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:56.359225 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxglm"
Apr 22 18:46:56.359403 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:56.359251 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjbsn"
Apr 22 18:46:56.359403 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:56.359321 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxglm" podUID="54689db5-4b53-4548-b0fc-1a5da6d4dbcc"
Apr 22 18:46:56.359403 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:56.359373 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mjbsn" podUID="c87517bd-8a13-4cb2-bf88-0b3d8c58b67c"
Apr 22 18:46:56.359492 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:56.359439 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d4kbs" podUID="0c9de063-c082-4665-b7f0-97a86598f6a1"
Apr 22 18:46:58.069499 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:58.069456 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0c9de063-c082-4665-b7f0-97a86598f6a1-original-pull-secret\") pod \"global-pull-secret-syncer-d4kbs\" (UID: \"0c9de063-c082-4665-b7f0-97a86598f6a1\") " pod="kube-system/global-pull-secret-syncer-d4kbs"
Apr 22 18:46:58.069921 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:58.069633 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:58.069921 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:58.069699 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c9de063-c082-4665-b7f0-97a86598f6a1-original-pull-secret podName:0c9de063-c082-4665-b7f0-97a86598f6a1 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:02.069683444 +0000 UTC m=+17.309799524 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0c9de063-c082-4665-b7f0-97a86598f6a1-original-pull-secret") pod "global-pull-secret-syncer-d4kbs" (UID: "0c9de063-c082-4665-b7f0-97a86598f6a1") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:58.359309 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:58.359225 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxglm"
Apr 22 18:46:58.359309 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:58.359255 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-d4kbs"
Apr 22 18:46:58.359512 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:58.359338 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxglm" podUID="54689db5-4b53-4548-b0fc-1a5da6d4dbcc"
Apr 22 18:46:58.359512 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:46:58.359368 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjbsn"
Apr 22 18:46:58.359512 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:58.359451 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mjbsn" podUID="c87517bd-8a13-4cb2-bf88-0b3d8c58b67c"
Apr 22 18:46:58.359629 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:46:58.359522 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d4kbs" podUID="0c9de063-c082-4665-b7f0-97a86598f6a1"
Apr 22 18:47:00.359373 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:00.359335 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjbsn"
Apr 22 18:47:00.359373 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:00.359359 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d4kbs"
Apr 22 18:47:00.359866 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:00.359531 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxglm"
Apr 22 18:47:00.359866 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:00.359539 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mjbsn" podUID="c87517bd-8a13-4cb2-bf88-0b3d8c58b67c"
Apr 22 18:47:00.359866 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:00.359614 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d4kbs" podUID="0c9de063-c082-4665-b7f0-97a86598f6a1"
Apr 22 18:47:00.359866 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:00.359670 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxglm" podUID="54689db5-4b53-4548-b0fc-1a5da6d4dbcc"
Apr 22 18:47:02.001595 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:02.001560 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-88lkc\" (UniqueName: \"kubernetes.io/projected/54689db5-4b53-4548-b0fc-1a5da6d4dbcc-kube-api-access-88lkc\") pod \"network-check-target-cxglm\" (UID: \"54689db5-4b53-4548-b0fc-1a5da6d4dbcc\") " pod="openshift-network-diagnostics/network-check-target-cxglm"
Apr 22 18:47:02.002057 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:02.001738 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:47:02.002057 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:02.001769 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:47:02.002057 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:02.001784 2573 projected.go:194] Error preparing data for projected volume kube-api-access-88lkc for pod openshift-network-diagnostics/network-check-target-cxglm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:47:02.002057 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:02.001841 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/54689db5-4b53-4548-b0fc-1a5da6d4dbcc-kube-api-access-88lkc podName:54689db5-4b53-4548-b0fc-1a5da6d4dbcc nodeName:}" failed. No retries permitted until 2026-04-22 18:47:18.001823296 +0000 UTC m=+33.241939394 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-88lkc" (UniqueName: "kubernetes.io/projected/54689db5-4b53-4548-b0fc-1a5da6d4dbcc-kube-api-access-88lkc") pod "network-check-target-cxglm" (UID: "54689db5-4b53-4548-b0fc-1a5da6d4dbcc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:47:02.102382 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:02.102344 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0c9de063-c082-4665-b7f0-97a86598f6a1-original-pull-secret\") pod \"global-pull-secret-syncer-d4kbs\" (UID: \"0c9de063-c082-4665-b7f0-97a86598f6a1\") " pod="kube-system/global-pull-secret-syncer-d4kbs"
Apr 22 18:47:02.102382 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:02.102401 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c87517bd-8a13-4cb2-bf88-0b3d8c58b67c-metrics-certs\") pod \"network-metrics-daemon-mjbsn\" (UID: \"c87517bd-8a13-4cb2-bf88-0b3d8c58b67c\") " pod="openshift-multus/network-metrics-daemon-mjbsn"
Apr 22 18:47:02.102626 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:02.102448 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:47:02.102626 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:02.102505 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:47:02.102626 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:02.102529 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c9de063-c082-4665-b7f0-97a86598f6a1-original-pull-secret podName:0c9de063-c082-4665-b7f0-97a86598f6a1 nodeName:}" failed.
No retries permitted until 2026-04-22 18:47:10.102505716 +0000 UTC m=+25.342621818 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0c9de063-c082-4665-b7f0-97a86598f6a1-original-pull-secret") pod "global-pull-secret-syncer-d4kbs" (UID: "0c9de063-c082-4665-b7f0-97a86598f6a1") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:47:02.102626 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:02.102550 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c87517bd-8a13-4cb2-bf88-0b3d8c58b67c-metrics-certs podName:c87517bd-8a13-4cb2-bf88-0b3d8c58b67c nodeName:}" failed. No retries permitted until 2026-04-22 18:47:18.102538309 +0000 UTC m=+33.342654388 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c87517bd-8a13-4cb2-bf88-0b3d8c58b67c-metrics-certs") pod "network-metrics-daemon-mjbsn" (UID: "c87517bd-8a13-4cb2-bf88-0b3d8c58b67c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:47:02.358893 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:02.358807 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxglm"
Apr 22 18:47:02.359061 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:02.358933 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxglm" podUID="54689db5-4b53-4548-b0fc-1a5da6d4dbcc"
Apr 22 18:47:02.359061 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:02.358952 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjbsn"
Apr 22 18:47:02.359061 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:02.358969 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d4kbs"
Apr 22 18:47:02.359061 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:02.359051 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d4kbs" podUID="0c9de063-c082-4665-b7f0-97a86598f6a1"
Apr 22 18:47:02.359294 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:02.359133 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mjbsn" podUID="c87517bd-8a13-4cb2-bf88-0b3d8c58b67c"
Apr 22 18:47:04.359670 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:04.359641 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxglm"
Apr 22 18:47:04.359670 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:04.359666 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d4kbs"
Apr 22 18:47:04.360093 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:04.359712 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjbsn"
Apr 22 18:47:04.360093 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:04.359799 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxglm" podUID="54689db5-4b53-4548-b0fc-1a5da6d4dbcc"
Apr 22 18:47:04.360093 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:04.359918 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mjbsn" podUID="c87517bd-8a13-4cb2-bf88-0b3d8c58b67c"
Apr 22 18:47:04.360093 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:04.359985 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-d4kbs" podUID="0c9de063-c082-4665-b7f0-97a86598f6a1"
Apr 22 18:47:05.476814 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:05.476396 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fsm87" event={"ID":"613626ea-ce6e-4907-a058-fed79d83cc79","Type":"ContainerStarted","Data":"1d4cea0ca918a58e08f73da4138945db619e53ed04ac2f47c7c11e74905e872c"}
Apr 22 18:47:05.477808 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:05.477786 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sklxm" event={"ID":"c935a1fa-6e87-4c17-b437-53842e8bf6b5","Type":"ContainerStarted","Data":"655ec1d84681cfda066f7af84b4c89fa0c2577ac2e0f0540f75e2a9d0b1d13cb"}
Apr 22 18:47:05.479030 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:05.479009 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" event={"ID":"fc8f4fb2-59ad-47bf-a8cd-94661b09e98d","Type":"ContainerStarted","Data":"ca94d9dab65b48b1b75359a9b2bc2ff6b268ea972d01dfac164dfb0440b8c684"}
Apr 22 18:47:05.480226 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:05.480203 2573 generic.go:358] "Generic (PLEG): container finished" podID="eafa0e7552fac0ae9b69b17030b941b2" containerID="b422da1c2a3343570eb4de8b0483db184996d02930b2fe1f7bae97589a810266" exitCode=0
Apr 22 18:47:05.480307 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:05.480239 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-84.ec2.internal" event={"ID":"eafa0e7552fac0ae9b69b17030b941b2","Type":"ContainerDied","Data":"b422da1c2a3343570eb4de8b0483db184996d02930b2fe1f7bae97589a810266"}
Apr 22 18:47:05.481624 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:05.481603 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-tkwkf" event={"ID":"6ff01b97-8ec7-4056-9191-72e56fe99653","Type":"ContainerStarted","Data":"6228fa010e8c4c5a2a9859e7934fa2c791da45a9ace0af4d1bd8d039c8dd1e2b"}
Apr 22 18:47:05.482838 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:05.482819 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rdcjp" event={"ID":"6583213e-9d3a-4b3f-b477-300fa1ff26c2","Type":"ContainerStarted","Data":"743b3159a89aefa804294bbf5ad1a2fb87d4f5913433644b7589d382ee2ad439"}
Apr 22 18:47:05.484006 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:05.483977 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-84.ec2.internal" event={"ID":"531ef67c8f8283d29f42b90580fc5209","Type":"ContainerStarted","Data":"2556d89466224c90902f8a73db05b92b8dffb78e99078cf1c0bbfd917b0b3a71"}
Apr 22 18:47:05.485333 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:05.485313 2573 generic.go:358] "Generic (PLEG): container finished" podID="d278e3ab-ea06-4b10-bfb7-327499648b8a" containerID="c4f894207d1adc306927cc259693cc32ba3d006452b8bd110594c856c7260dec" exitCode=0
Apr 22 18:47:05.485420 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:05.485341 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9qjxs" event={"ID":"d278e3ab-ea06-4b10-bfb7-327499648b8a","Type":"ContainerDied","Data":"c4f894207d1adc306927cc259693cc32ba3d006452b8bd110594c856c7260dec"}
Apr 22 18:47:05.488269 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:05.486977 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhxx6" event={"ID":"dcdba7ee-3007-4669-90f0-6d189f580f74","Type":"ContainerStarted","Data":"f2ab9f01693aba51400b73372a618ff413de5a051b0fd0ed5c64a4b43a536d97"}
Apr 22 18:47:05.491526 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:05.491510 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cr6wp_bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278/ovn-acl-logging/0.log"
Apr 22 18:47:05.492105 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:05.492084 2573 generic.go:358] "Generic (PLEG): container finished" podID="bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278" containerID="6a48f21badb319e5d2a1a675347dede162fe8016e36e8e103459d7fa1c252a12" exitCode=1
Apr 22 18:47:05.492194 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:05.492115 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" event={"ID":"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278","Type":"ContainerStarted","Data":"723a0c2ea5064435d4e3fabb4050261105beb1b8e95e9de18db866bc23e0b2fc"}
Apr 22 18:47:05.492194 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:05.492129 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" event={"ID":"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278","Type":"ContainerStarted","Data":"f45d7bcfb77a0e0df18667bd77ac0820829da4ac0f5bf09c53f075fc0e7b2d23"}
Apr 22 18:47:05.492194 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:05.492139 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" event={"ID":"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278","Type":"ContainerStarted","Data":"d5cc3edf6ebeea7e867e99071f93e98906613a63d3e6de5102001836c86cab61"}
Apr 22 18:47:05.492194 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:05.492149 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" event={"ID":"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278","Type":"ContainerStarted","Data":"afbda08c54e899dbf37d329350c4ef74e0554afdecaaff59267549976c75bd3f"}
Apr 22 18:47:05.492194 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:05.492157 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp"
event={"ID":"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278","Type":"ContainerDied","Data":"6a48f21badb319e5d2a1a675347dede162fe8016e36e8e103459d7fa1c252a12"} Apr 22 18:47:05.492345 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:05.492209 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" event={"ID":"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278","Type":"ContainerStarted","Data":"7957235d9f421ddaf251576ef5d861b4fb2742c77af0fbc64784364371236e7b"} Apr 22 18:47:05.494313 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:05.494281 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-fsm87" podStartSLOduration=2.579476736 podStartE2EDuration="20.494272037s" podCreationTimestamp="2026-04-22 18:46:45 +0000 UTC" firstStartedPulling="2026-04-22 18:46:46.617905911 +0000 UTC m=+1.858021994" lastFinishedPulling="2026-04-22 18:47:04.532701215 +0000 UTC m=+19.772817295" observedRunningTime="2026-04-22 18:47:05.494051831 +0000 UTC m=+20.734167933" watchObservedRunningTime="2026-04-22 18:47:05.494272037 +0000 UTC m=+20.734388138" Apr 22 18:47:05.518468 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:05.518427 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-84.ec2.internal" podStartSLOduration=20.518412071 podStartE2EDuration="20.518412071s" podCreationTimestamp="2026-04-22 18:46:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:47:05.517969375 +0000 UTC m=+20.758085478" watchObservedRunningTime="2026-04-22 18:47:05.518412071 +0000 UTC m=+20.758528174" Apr 22 18:47:05.532439 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:05.532400 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-hgrcf" podStartSLOduration=2.561781822 
podStartE2EDuration="20.53238647s" podCreationTimestamp="2026-04-22 18:46:45 +0000 UTC" firstStartedPulling="2026-04-22 18:46:46.574876418 +0000 UTC m=+1.814992498" lastFinishedPulling="2026-04-22 18:47:04.545481066 +0000 UTC m=+19.785597146" observedRunningTime="2026-04-22 18:47:05.531928885 +0000 UTC m=+20.772044989" watchObservedRunningTime="2026-04-22 18:47:05.53238647 +0000 UTC m=+20.772502572" Apr 22 18:47:05.544908 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:05.544683 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-tkwkf" podStartSLOduration=2.674863658 podStartE2EDuration="20.544669268s" podCreationTimestamp="2026-04-22 18:46:45 +0000 UTC" firstStartedPulling="2026-04-22 18:46:46.662897104 +0000 UTC m=+1.903013184" lastFinishedPulling="2026-04-22 18:47:04.532702715 +0000 UTC m=+19.772818794" observedRunningTime="2026-04-22 18:47:05.544385019 +0000 UTC m=+20.784501121" watchObservedRunningTime="2026-04-22 18:47:05.544669268 +0000 UTC m=+20.784785371" Apr 22 18:47:05.557875 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:05.557834 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-sklxm" podStartSLOduration=2.622900521 podStartE2EDuration="20.557821034s" podCreationTimestamp="2026-04-22 18:46:45 +0000 UTC" firstStartedPulling="2026-04-22 18:46:46.597803868 +0000 UTC m=+1.837919948" lastFinishedPulling="2026-04-22 18:47:04.532724364 +0000 UTC m=+19.772840461" observedRunningTime="2026-04-22 18:47:05.557552027 +0000 UTC m=+20.797668128" watchObservedRunningTime="2026-04-22 18:47:05.557821034 +0000 UTC m=+20.797937135" Apr 22 18:47:05.600243 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:05.600206 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-rdcjp" podStartSLOduration=2.634486447 podStartE2EDuration="20.60019293s" podCreationTimestamp="2026-04-22 18:46:45 +0000 UTC" 
firstStartedPulling="2026-04-22 18:46:46.613543816 +0000 UTC m=+1.853659896" lastFinishedPulling="2026-04-22 18:47:04.5792503 +0000 UTC m=+19.819366379" observedRunningTime="2026-04-22 18:47:05.578968081 +0000 UTC m=+20.819084185" watchObservedRunningTime="2026-04-22 18:47:05.60019293 +0000 UTC m=+20.840309032" Apr 22 18:47:05.838879 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:05.838795 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-tkwkf" Apr 22 18:47:05.839441 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:05.839422 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-tkwkf" Apr 22 18:47:06.339698 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:06.339674 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 18:47:06.359039 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:06.359013 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d4kbs" Apr 22 18:47:06.359146 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:06.359043 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjbsn" Apr 22 18:47:06.359146 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:06.359058 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxglm" Apr 22 18:47:06.359248 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:06.359159 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-d4kbs" podUID="0c9de063-c082-4665-b7f0-97a86598f6a1" Apr 22 18:47:06.359248 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:06.359233 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxglm" podUID="54689db5-4b53-4548-b0fc-1a5da6d4dbcc" Apr 22 18:47:06.359422 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:06.359390 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mjbsn" podUID="c87517bd-8a13-4cb2-bf88-0b3d8c58b67c" Apr 22 18:47:06.496457 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:06.496382 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhxx6" event={"ID":"dcdba7ee-3007-4669-90f0-6d189f580f74","Type":"ContainerStarted","Data":"09f59e4f70e8a02f0a2899d5be4e816d426fa7eb1a2a329ea2e1bac170c5efc4"} Apr 22 18:47:06.498210 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:06.498187 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-wp4tk" event={"ID":"67bcb723-894e-4da7-a40d-b8def8103a95","Type":"ContainerStarted","Data":"a803bcd72e1e8d761ba3d3ca06ce983d8d0fc063eae394eb81f73e83d068e485"} Apr 22 18:47:06.499919 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:06.499894 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-84.ec2.internal" 
event={"ID":"eafa0e7552fac0ae9b69b17030b941b2","Type":"ContainerStarted","Data":"14dd9e7c92eac9950240e9646e44c47e0a8f5eac7a5f36bc60e6ce2f2ec8b773"} Apr 22 18:47:06.501006 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:06.500978 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-tkwkf" Apr 22 18:47:06.501608 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:06.501592 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-tkwkf" Apr 22 18:47:06.511763 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:06.511727 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-wp4tk" podStartSLOduration=3.600276057 podStartE2EDuration="21.511716451s" podCreationTimestamp="2026-04-22 18:46:45 +0000 UTC" firstStartedPulling="2026-04-22 18:46:46.63323832 +0000 UTC m=+1.873354399" lastFinishedPulling="2026-04-22 18:47:04.544678698 +0000 UTC m=+19.784794793" observedRunningTime="2026-04-22 18:47:06.511205376 +0000 UTC m=+21.751321477" watchObservedRunningTime="2026-04-22 18:47:06.511716451 +0000 UTC m=+21.751832552" Apr 22 18:47:06.538727 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:06.538666 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-84.ec2.internal" podStartSLOduration=21.538648589 podStartE2EDuration="21.538648589s" podCreationTimestamp="2026-04-22 18:46:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:47:06.524505075 +0000 UTC m=+21.764621189" watchObservedRunningTime="2026-04-22 18:47:06.538648589 +0000 UTC m=+21.778764694" Apr 22 18:47:07.338140 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:07.338007 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T18:47:06.339693338Z","UUID":"36036916-bc4d-496d-a664-23e726045b84","Handler":null,"Name":"","Endpoint":""} Apr 22 18:47:07.343832 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:07.343805 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 18:47:07.343981 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:07.343841 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 18:47:07.504489 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:07.504399 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhxx6" event={"ID":"dcdba7ee-3007-4669-90f0-6d189f580f74","Type":"ContainerStarted","Data":"05cf1aca519ec3daf517e362cf10bee47b9748ef401ee58698e81db34820c439"} Apr 22 18:47:07.507473 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:07.507448 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cr6wp_bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278/ovn-acl-logging/0.log" Apr 22 18:47:07.508234 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:07.508183 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" event={"ID":"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278","Type":"ContainerStarted","Data":"52d278a5457e7e8707e739ecb50e1ea5939079b7cdb49b7f0fee6a7875ea94f8"} Apr 22 18:47:07.520692 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:07.520651 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhxx6" podStartSLOduration=2.031582748 podStartE2EDuration="22.520637287s" podCreationTimestamp="2026-04-22 18:46:45 +0000 UTC" 
firstStartedPulling="2026-04-22 18:46:46.670895737 +0000 UTC m=+1.911011820" lastFinishedPulling="2026-04-22 18:47:07.159950266 +0000 UTC m=+22.400066359" observedRunningTime="2026-04-22 18:47:07.520529104 +0000 UTC m=+22.760645206" watchObservedRunningTime="2026-04-22 18:47:07.520637287 +0000 UTC m=+22.760753390" Apr 22 18:47:08.359049 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:08.359012 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxglm" Apr 22 18:47:08.359049 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:08.359040 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjbsn" Apr 22 18:47:08.359340 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:08.359139 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxglm" podUID="54689db5-4b53-4548-b0fc-1a5da6d4dbcc" Apr 22 18:47:08.359340 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:08.359180 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d4kbs" Apr 22 18:47:08.359340 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:08.359278 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-d4kbs" podUID="0c9de063-c082-4665-b7f0-97a86598f6a1" Apr 22 18:47:08.359506 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:08.359367 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mjbsn" podUID="c87517bd-8a13-4cb2-bf88-0b3d8c58b67c" Apr 22 18:47:10.162265 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:10.161958 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0c9de063-c082-4665-b7f0-97a86598f6a1-original-pull-secret\") pod \"global-pull-secret-syncer-d4kbs\" (UID: \"0c9de063-c082-4665-b7f0-97a86598f6a1\") " pod="kube-system/global-pull-secret-syncer-d4kbs" Apr 22 18:47:10.162952 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:10.162090 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:47:10.162952 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:10.162355 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c9de063-c082-4665-b7f0-97a86598f6a1-original-pull-secret podName:0c9de063-c082-4665-b7f0-97a86598f6a1 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:26.162339608 +0000 UTC m=+41.402455688 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0c9de063-c082-4665-b7f0-97a86598f6a1-original-pull-secret") pod "global-pull-secret-syncer-d4kbs" (UID: "0c9de063-c082-4665-b7f0-97a86598f6a1") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:47:10.358853 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:10.358826 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d4kbs" Apr 22 18:47:10.358853 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:10.358858 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjbsn" Apr 22 18:47:10.359054 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:10.358931 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d4kbs" podUID="0c9de063-c082-4665-b7f0-97a86598f6a1" Apr 22 18:47:10.359054 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:10.358969 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mjbsn" podUID="c87517bd-8a13-4cb2-bf88-0b3d8c58b67c" Apr 22 18:47:10.359054 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:10.358998 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxglm" Apr 22 18:47:10.359054 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:10.359041 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxglm" podUID="54689db5-4b53-4548-b0fc-1a5da6d4dbcc" Apr 22 18:47:10.514370 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:10.514264 2573 generic.go:358] "Generic (PLEG): container finished" podID="d278e3ab-ea06-4b10-bfb7-327499648b8a" containerID="8f92938e111f02cc6a84417634b2fbc4eb10ebbe2e6d4f5bb134da4ada6cf9e6" exitCode=0 Apr 22 18:47:10.514370 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:10.514337 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9qjxs" event={"ID":"d278e3ab-ea06-4b10-bfb7-327499648b8a","Type":"ContainerDied","Data":"8f92938e111f02cc6a84417634b2fbc4eb10ebbe2e6d4f5bb134da4ada6cf9e6"} Apr 22 18:47:10.518200 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:10.518182 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cr6wp_bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278/ovn-acl-logging/0.log" Apr 22 18:47:10.518498 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:10.518473 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" event={"ID":"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278","Type":"ContainerStarted","Data":"b8009befaa9806261c514bac5d46b7aee827fd56c31a7c686daba5d7acbb8a4e"} Apr 22 18:47:10.518888 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:10.518845 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 
18:47:10.518974 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:10.518957 2573 scope.go:117] "RemoveContainer" containerID="6a48f21badb319e5d2a1a675347dede162fe8016e36e8e103459d7fa1c252a12" Apr 22 18:47:10.534537 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:10.534515 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:47:11.497735 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:11.497671 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-d4kbs"] Apr 22 18:47:11.498066 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:11.497787 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d4kbs" Apr 22 18:47:11.498066 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:11.497897 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d4kbs" podUID="0c9de063-c082-4665-b7f0-97a86598f6a1" Apr 22 18:47:11.504216 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:11.500870 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-cxglm"] Apr 22 18:47:11.504216 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:11.503659 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxglm" Apr 22 18:47:11.504216 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:11.503799 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxglm" podUID="54689db5-4b53-4548-b0fc-1a5da6d4dbcc" Apr 22 18:47:11.505387 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:11.505271 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mjbsn"] Apr 22 18:47:11.505767 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:11.505741 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjbsn" Apr 22 18:47:11.505861 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:11.505840 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjbsn" podUID="c87517bd-8a13-4cb2-bf88-0b3d8c58b67c" Apr 22 18:47:11.522123 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:11.522103 2573 generic.go:358] "Generic (PLEG): container finished" podID="d278e3ab-ea06-4b10-bfb7-327499648b8a" containerID="80482fae9f174fcc2b035271cb16b1601739beb371dc620f706217815ed65982" exitCode=0 Apr 22 18:47:11.522231 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:11.522203 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9qjxs" event={"ID":"d278e3ab-ea06-4b10-bfb7-327499648b8a","Type":"ContainerDied","Data":"80482fae9f174fcc2b035271cb16b1601739beb371dc620f706217815ed65982"} Apr 22 18:47:11.525873 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:11.525857 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cr6wp_bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278/ovn-acl-logging/0.log" Apr 22 18:47:11.526264 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:11.526242 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" event={"ID":"bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278","Type":"ContainerStarted","Data":"9481c8e92158e801f697d46dde00dd1251e4488a33aa0f8a6571959e40e352b0"} Apr 22 18:47:11.526414 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:11.526402 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 18:47:11.526693 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:11.526676 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:47:11.541862 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:11.541842 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:47:11.565534 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:11.565496 2573 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" podStartSLOduration=8.555452594 podStartE2EDuration="26.565485377s" podCreationTimestamp="2026-04-22 18:46:45 +0000 UTC" firstStartedPulling="2026-04-22 18:46:46.638352326 +0000 UTC m=+1.878468406" lastFinishedPulling="2026-04-22 18:47:04.648385107 +0000 UTC m=+19.888501189" observedRunningTime="2026-04-22 18:47:11.564075956 +0000 UTC m=+26.804192070" watchObservedRunningTime="2026-04-22 18:47:11.565485377 +0000 UTC m=+26.805601479" Apr 22 18:47:12.529849 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:12.529812 2573 generic.go:358] "Generic (PLEG): container finished" podID="d278e3ab-ea06-4b10-bfb7-327499648b8a" containerID="9f7255a0f394b01c20cac9de135a97f34cca254e0718f29c71dc55327fa4d052" exitCode=0 Apr 22 18:47:12.530215 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:12.529906 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9qjxs" event={"ID":"d278e3ab-ea06-4b10-bfb7-327499648b8a","Type":"ContainerDied","Data":"9f7255a0f394b01c20cac9de135a97f34cca254e0718f29c71dc55327fa4d052"} Apr 22 18:47:12.530215 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:12.530128 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 18:47:13.359534 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:13.359501 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxglm" Apr 22 18:47:13.359715 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:13.359501 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-d4kbs"
Apr 22 18:47:13.359715 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:13.359628 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxglm" podUID="54689db5-4b53-4548-b0fc-1a5da6d4dbcc"
Apr 22 18:47:13.359816 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:13.359719 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d4kbs" podUID="0c9de063-c082-4665-b7f0-97a86598f6a1"
Apr 22 18:47:13.359816 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:13.359767 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjbsn"
Apr 22 18:47:13.359916 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:13.359852 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mjbsn" podUID="c87517bd-8a13-4cb2-bf88-0b3d8c58b67c"
Apr 22 18:47:13.531995 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:13.531967 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 22 18:47:15.361466 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:15.361304 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d4kbs"
Apr 22 18:47:15.361845 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:15.361384 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxglm"
Apr 22 18:47:15.361845 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:15.361541 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d4kbs" podUID="0c9de063-c082-4665-b7f0-97a86598f6a1"
Apr 22 18:47:15.361845 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:15.361624 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxglm" podUID="54689db5-4b53-4548-b0fc-1a5da6d4dbcc"
Apr 22 18:47:15.361845 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:15.361414 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjbsn"
Apr 22 18:47:15.361845 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:15.361727 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mjbsn" podUID="c87517bd-8a13-4cb2-bf88-0b3d8c58b67c"
Apr 22 18:47:15.612268 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:15.612160 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp"
Apr 22 18:47:15.613309 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:15.612420 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 22 18:47:15.632038 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:15.631986 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" podUID="bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278" containerName="ovnkube-controller" probeResult="failure" output=""
Apr 22 18:47:15.642382 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:15.642350 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" podUID="bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278" containerName="ovnkube-controller" probeResult="failure" output=""
Apr 22 18:47:17.359094 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:17.359063 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjbsn"
Apr 22 18:47:17.359602 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:17.359063 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d4kbs"
Apr 22 18:47:17.359602 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:17.359057 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxglm"
Apr 22 18:47:17.359602 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:17.359211 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mjbsn" podUID="c87517bd-8a13-4cb2-bf88-0b3d8c58b67c"
Apr 22 18:47:17.359602 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:17.359343 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d4kbs" podUID="0c9de063-c082-4665-b7f0-97a86598f6a1"
Apr 22 18:47:17.359602 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:17.359429 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxglm" podUID="54689db5-4b53-4548-b0fc-1a5da6d4dbcc"
Apr 22 18:47:17.557479 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:17.557412 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-84.ec2.internal" event="NodeReady"
Apr 22 18:47:17.557620 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:17.557531 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 22 18:47:17.596669 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:17.596611 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-q62qx"]
Apr 22 18:47:17.633448 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:17.633411 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-sd2xm"]
Apr 22 18:47:17.633601 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:17.633523 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-q62qx"
Apr 22 18:47:17.635972 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:17.635928 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 22 18:47:17.636256 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:17.636238 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 22 18:47:17.636256 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:17.636251 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zg5rc\""
Apr 22 18:47:17.659640 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:17.659605 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-q62qx"]
Apr 22 18:47:17.659737 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:17.659650 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-sd2xm"]
Apr 22 18:47:17.659737 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:17.659680 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sd2xm"
Apr 22 18:47:17.662779 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:17.662114 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 22 18:47:17.662779 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:17.662147 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rxjsb\""
Apr 22 18:47:17.662779 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:17.662221 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 22 18:47:17.662779 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:17.662413 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 22 18:47:17.719240 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:17.719202 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/39993c75-c30d-49df-8bc3-5e7250217350-tmp-dir\") pod \"dns-default-q62qx\" (UID: \"39993c75-c30d-49df-8bc3-5e7250217350\") " pod="openshift-dns/dns-default-q62qx"
Apr 22 18:47:17.719359 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:17.719258 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39993c75-c30d-49df-8bc3-5e7250217350-config-volume\") pod \"dns-default-q62qx\" (UID: \"39993c75-c30d-49df-8bc3-5e7250217350\") " pod="openshift-dns/dns-default-q62qx"
Apr 22 18:47:17.719359 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:17.719302 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59sjj\" (UniqueName: \"kubernetes.io/projected/39993c75-c30d-49df-8bc3-5e7250217350-kube-api-access-59sjj\") pod \"dns-default-q62qx\" (UID: \"39993c75-c30d-49df-8bc3-5e7250217350\") " pod="openshift-dns/dns-default-q62qx"
Apr 22 18:47:17.719359 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:17.719335 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39993c75-c30d-49df-8bc3-5e7250217350-metrics-tls\") pod \"dns-default-q62qx\" (UID: \"39993c75-c30d-49df-8bc3-5e7250217350\") " pod="openshift-dns/dns-default-q62qx"
Apr 22 18:47:17.820265 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:17.820192 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39993c75-c30d-49df-8bc3-5e7250217350-config-volume\") pod \"dns-default-q62qx\" (UID: \"39993c75-c30d-49df-8bc3-5e7250217350\") " pod="openshift-dns/dns-default-q62qx"
Apr 22 18:47:17.820409 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:17.820265 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dls86\" (UniqueName: \"kubernetes.io/projected/2a601b18-e341-409a-8702-534bf559976c-kube-api-access-dls86\") pod \"ingress-canary-sd2xm\" (UID: \"2a601b18-e341-409a-8702-534bf559976c\") " pod="openshift-ingress-canary/ingress-canary-sd2xm"
Apr 22 18:47:17.820409 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:17.820309 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-59sjj\" (UniqueName: \"kubernetes.io/projected/39993c75-c30d-49df-8bc3-5e7250217350-kube-api-access-59sjj\") pod \"dns-default-q62qx\" (UID: \"39993c75-c30d-49df-8bc3-5e7250217350\") " pod="openshift-dns/dns-default-q62qx"
Apr 22 18:47:17.820409 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:17.820342 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a601b18-e341-409a-8702-534bf559976c-cert\") pod \"ingress-canary-sd2xm\" (UID: \"2a601b18-e341-409a-8702-534bf559976c\") " pod="openshift-ingress-canary/ingress-canary-sd2xm"
Apr 22 18:47:17.820409 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:17.820378 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39993c75-c30d-49df-8bc3-5e7250217350-metrics-tls\") pod \"dns-default-q62qx\" (UID: \"39993c75-c30d-49df-8bc3-5e7250217350\") " pod="openshift-dns/dns-default-q62qx"
Apr 22 18:47:17.820579 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:17.820412 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/39993c75-c30d-49df-8bc3-5e7250217350-tmp-dir\") pod \"dns-default-q62qx\" (UID: \"39993c75-c30d-49df-8bc3-5e7250217350\") " pod="openshift-dns/dns-default-q62qx"
Apr 22 18:47:17.820720 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:17.820692 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/39993c75-c30d-49df-8bc3-5e7250217350-tmp-dir\") pod \"dns-default-q62qx\" (UID: \"39993c75-c30d-49df-8bc3-5e7250217350\") " pod="openshift-dns/dns-default-q62qx"
Apr 22 18:47:17.820720 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:17.820705 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39993c75-c30d-49df-8bc3-5e7250217350-config-volume\") pod \"dns-default-q62qx\" (UID: \"39993c75-c30d-49df-8bc3-5e7250217350\") " pod="openshift-dns/dns-default-q62qx"
Apr 22 18:47:17.820831 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:17.820814 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:47:17.820894 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:17.820863 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39993c75-c30d-49df-8bc3-5e7250217350-metrics-tls podName:39993c75-c30d-49df-8bc3-5e7250217350 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:18.320849487 +0000 UTC m=+33.560965566 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/39993c75-c30d-49df-8bc3-5e7250217350-metrics-tls") pod "dns-default-q62qx" (UID: "39993c75-c30d-49df-8bc3-5e7250217350") : secret "dns-default-metrics-tls" not found
Apr 22 18:47:17.833105 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:17.833086 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-59sjj\" (UniqueName: \"kubernetes.io/projected/39993c75-c30d-49df-8bc3-5e7250217350-kube-api-access-59sjj\") pod \"dns-default-q62qx\" (UID: \"39993c75-c30d-49df-8bc3-5e7250217350\") " pod="openshift-dns/dns-default-q62qx"
Apr 22 18:47:17.921270 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:17.921225 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a601b18-e341-409a-8702-534bf559976c-cert\") pod \"ingress-canary-sd2xm\" (UID: \"2a601b18-e341-409a-8702-534bf559976c\") " pod="openshift-ingress-canary/ingress-canary-sd2xm"
Apr 22 18:47:17.921436 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:17.921329 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dls86\" (UniqueName: \"kubernetes.io/projected/2a601b18-e341-409a-8702-534bf559976c-kube-api-access-dls86\") pod \"ingress-canary-sd2xm\" (UID: \"2a601b18-e341-409a-8702-534bf559976c\") " pod="openshift-ingress-canary/ingress-canary-sd2xm"
Apr 22 18:47:17.921436 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:17.921390 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:47:17.921551 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:17.921479 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a601b18-e341-409a-8702-534bf559976c-cert podName:2a601b18-e341-409a-8702-534bf559976c nodeName:}" failed. No retries permitted until 2026-04-22 18:47:18.421458324 +0000 UTC m=+33.661574403 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a601b18-e341-409a-8702-534bf559976c-cert") pod "ingress-canary-sd2xm" (UID: "2a601b18-e341-409a-8702-534bf559976c") : secret "canary-serving-cert" not found
Apr 22 18:47:17.930055 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:17.930029 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dls86\" (UniqueName: \"kubernetes.io/projected/2a601b18-e341-409a-8702-534bf559976c-kube-api-access-dls86\") pod \"ingress-canary-sd2xm\" (UID: \"2a601b18-e341-409a-8702-534bf559976c\") " pod="openshift-ingress-canary/ingress-canary-sd2xm"
Apr 22 18:47:18.022230 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:18.022196 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-88lkc\" (UniqueName: \"kubernetes.io/projected/54689db5-4b53-4548-b0fc-1a5da6d4dbcc-kube-api-access-88lkc\") pod \"network-check-target-cxglm\" (UID: \"54689db5-4b53-4548-b0fc-1a5da6d4dbcc\") " pod="openshift-network-diagnostics/network-check-target-cxglm"
Apr 22 18:47:18.022388 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:18.022361 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:47:18.022388 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:18.022382 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:47:18.022469 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:18.022394 2573 projected.go:194] Error preparing data for projected volume kube-api-access-88lkc for pod openshift-network-diagnostics/network-check-target-cxglm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:47:18.022469 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:18.022456 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/54689db5-4b53-4548-b0fc-1a5da6d4dbcc-kube-api-access-88lkc podName:54689db5-4b53-4548-b0fc-1a5da6d4dbcc nodeName:}" failed. No retries permitted until 2026-04-22 18:47:50.02243564 +0000 UTC m=+65.262551750 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-88lkc" (UniqueName: "kubernetes.io/projected/54689db5-4b53-4548-b0fc-1a5da6d4dbcc-kube-api-access-88lkc") pod "network-check-target-cxglm" (UID: "54689db5-4b53-4548-b0fc-1a5da6d4dbcc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:47:18.122750 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:18.122720 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c87517bd-8a13-4cb2-bf88-0b3d8c58b67c-metrics-certs\") pod \"network-metrics-daemon-mjbsn\" (UID: \"c87517bd-8a13-4cb2-bf88-0b3d8c58b67c\") " pod="openshift-multus/network-metrics-daemon-mjbsn"
Apr 22 18:47:18.122877 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:18.122833 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:47:18.122924 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:18.122893 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c87517bd-8a13-4cb2-bf88-0b3d8c58b67c-metrics-certs podName:c87517bd-8a13-4cb2-bf88-0b3d8c58b67c nodeName:}" failed. No retries permitted until 2026-04-22 18:47:50.122877044 +0000 UTC m=+65.362993125 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c87517bd-8a13-4cb2-bf88-0b3d8c58b67c-metrics-certs") pod "network-metrics-daemon-mjbsn" (UID: "c87517bd-8a13-4cb2-bf88-0b3d8c58b67c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:47:18.325012 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:18.324971 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39993c75-c30d-49df-8bc3-5e7250217350-metrics-tls\") pod \"dns-default-q62qx\" (UID: \"39993c75-c30d-49df-8bc3-5e7250217350\") " pod="openshift-dns/dns-default-q62qx"
Apr 22 18:47:18.325185 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:18.325123 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:47:18.325240 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:18.325233 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39993c75-c30d-49df-8bc3-5e7250217350-metrics-tls podName:39993c75-c30d-49df-8bc3-5e7250217350 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:19.325213901 +0000 UTC m=+34.565329982 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/39993c75-c30d-49df-8bc3-5e7250217350-metrics-tls") pod "dns-default-q62qx" (UID: "39993c75-c30d-49df-8bc3-5e7250217350") : secret "dns-default-metrics-tls" not found
Apr 22 18:47:18.425651 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:18.425588 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a601b18-e341-409a-8702-534bf559976c-cert\") pod \"ingress-canary-sd2xm\" (UID: \"2a601b18-e341-409a-8702-534bf559976c\") " pod="openshift-ingress-canary/ingress-canary-sd2xm"
Apr 22 18:47:18.426045 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:18.425709 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:47:18.426045 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:18.425783 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a601b18-e341-409a-8702-534bf559976c-cert podName:2a601b18-e341-409a-8702-534bf559976c nodeName:}" failed. No retries permitted until 2026-04-22 18:47:19.425749445 +0000 UTC m=+34.665865542 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a601b18-e341-409a-8702-534bf559976c-cert") pod "ingress-canary-sd2xm" (UID: "2a601b18-e341-409a-8702-534bf559976c") : secret "canary-serving-cert" not found
Apr 22 18:47:19.331037 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:19.330997 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39993c75-c30d-49df-8bc3-5e7250217350-metrics-tls\") pod \"dns-default-q62qx\" (UID: \"39993c75-c30d-49df-8bc3-5e7250217350\") " pod="openshift-dns/dns-default-q62qx"
Apr 22 18:47:19.331254 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:19.331154 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:47:19.331254 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:19.331236 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39993c75-c30d-49df-8bc3-5e7250217350-metrics-tls podName:39993c75-c30d-49df-8bc3-5e7250217350 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:21.331218519 +0000 UTC m=+36.571334604 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/39993c75-c30d-49df-8bc3-5e7250217350-metrics-tls") pod "dns-default-q62qx" (UID: "39993c75-c30d-49df-8bc3-5e7250217350") : secret "dns-default-metrics-tls" not found
Apr 22 18:47:19.359463 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:19.359425 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxglm"
Apr 22 18:47:19.359608 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:19.359433 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d4kbs"
Apr 22 18:47:19.359661 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:19.359433 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjbsn"
Apr 22 18:47:19.362350 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:19.362330 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 18:47:19.362350 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:19.362340 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-m94d4\""
Apr 22 18:47:19.363318 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:19.363298 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-5ggfs\""
Apr 22 18:47:19.363550 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:19.363524 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 22 18:47:19.363626 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:19.363556 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 18:47:19.363626 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:19.363603 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 18:47:19.431789 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:19.431770 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a601b18-e341-409a-8702-534bf559976c-cert\") pod \"ingress-canary-sd2xm\" (UID: \"2a601b18-e341-409a-8702-534bf559976c\") " pod="openshift-ingress-canary/ingress-canary-sd2xm"
Apr 22 18:47:19.432127 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:19.431910 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:47:19.432127 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:19.431963 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a601b18-e341-409a-8702-534bf559976c-cert podName:2a601b18-e341-409a-8702-534bf559976c nodeName:}" failed. No retries permitted until 2026-04-22 18:47:21.431946253 +0000 UTC m=+36.672062348 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a601b18-e341-409a-8702-534bf559976c-cert") pod "ingress-canary-sd2xm" (UID: "2a601b18-e341-409a-8702-534bf559976c") : secret "canary-serving-cert" not found
Apr 22 18:47:19.546608 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:19.546580 2573 generic.go:358] "Generic (PLEG): container finished" podID="d278e3ab-ea06-4b10-bfb7-327499648b8a" containerID="2e66c8683ea08366142ea2cef7d1a38b85baf6ba926031dc57fbc45400f11b3c" exitCode=0
Apr 22 18:47:19.546700 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:19.546633 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9qjxs" event={"ID":"d278e3ab-ea06-4b10-bfb7-327499648b8a","Type":"ContainerDied","Data":"2e66c8683ea08366142ea2cef7d1a38b85baf6ba926031dc57fbc45400f11b3c"}
Apr 22 18:47:20.551241 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:20.551201 2573 generic.go:358] "Generic (PLEG): container finished" podID="d278e3ab-ea06-4b10-bfb7-327499648b8a" containerID="153214732daddd5678a5224ea2855b4e1910a25ce6ea0b2d4c8551c2d6b24981" exitCode=0
Apr 22 18:47:20.551653 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:20.551266 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9qjxs" event={"ID":"d278e3ab-ea06-4b10-bfb7-327499648b8a","Type":"ContainerDied","Data":"153214732daddd5678a5224ea2855b4e1910a25ce6ea0b2d4c8551c2d6b24981"}
Apr 22 18:47:21.346624 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:21.346552 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39993c75-c30d-49df-8bc3-5e7250217350-metrics-tls\") pod \"dns-default-q62qx\" (UID: \"39993c75-c30d-49df-8bc3-5e7250217350\") " pod="openshift-dns/dns-default-q62qx"
Apr 22 18:47:21.346761 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:21.346662 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:47:21.346761 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:21.346715 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39993c75-c30d-49df-8bc3-5e7250217350-metrics-tls podName:39993c75-c30d-49df-8bc3-5e7250217350 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:25.346700225 +0000 UTC m=+40.586816309 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/39993c75-c30d-49df-8bc3-5e7250217350-metrics-tls") pod "dns-default-q62qx" (UID: "39993c75-c30d-49df-8bc3-5e7250217350") : secret "dns-default-metrics-tls" not found
Apr 22 18:47:21.446908 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:21.446880 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a601b18-e341-409a-8702-534bf559976c-cert\") pod \"ingress-canary-sd2xm\" (UID: \"2a601b18-e341-409a-8702-534bf559976c\") " pod="openshift-ingress-canary/ingress-canary-sd2xm"
Apr 22 18:47:21.447040 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:21.446973 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:47:21.447040 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:21.447022 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a601b18-e341-409a-8702-534bf559976c-cert podName:2a601b18-e341-409a-8702-534bf559976c nodeName:}" failed. No retries permitted until 2026-04-22 18:47:25.447007452 +0000 UTC m=+40.687123532 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a601b18-e341-409a-8702-534bf559976c-cert") pod "ingress-canary-sd2xm" (UID: "2a601b18-e341-409a-8702-534bf559976c") : secret "canary-serving-cert" not found
Apr 22 18:47:21.555826 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:21.555796 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9qjxs" event={"ID":"d278e3ab-ea06-4b10-bfb7-327499648b8a","Type":"ContainerStarted","Data":"624d7af191b327bbfade4b63d95f83d3ed4a1713df5fe69a0074d58b85c203d9"}
Apr 22 18:47:21.578114 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:21.578070 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-9qjxs" podStartSLOduration=4.745664711 podStartE2EDuration="36.578057893s" podCreationTimestamp="2026-04-22 18:46:45 +0000 UTC" firstStartedPulling="2026-04-22 18:46:46.674014022 +0000 UTC m=+1.914130101" lastFinishedPulling="2026-04-22 18:47:18.5064072 +0000 UTC m=+33.746523283" observedRunningTime="2026-04-22 18:47:21.577301564 +0000 UTC m=+36.817417668" watchObservedRunningTime="2026-04-22 18:47:21.578057893 +0000 UTC m=+36.818173995"
Apr 22 18:47:25.373055 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:25.373020 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39993c75-c30d-49df-8bc3-5e7250217350-metrics-tls\") pod \"dns-default-q62qx\" (UID: \"39993c75-c30d-49df-8bc3-5e7250217350\") " pod="openshift-dns/dns-default-q62qx"
Apr 22 18:47:25.373479 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:25.373130 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:47:25.373479 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:25.373215 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39993c75-c30d-49df-8bc3-5e7250217350-metrics-tls podName:39993c75-c30d-49df-8bc3-5e7250217350 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:33.373191916 +0000 UTC m=+48.613307996 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/39993c75-c30d-49df-8bc3-5e7250217350-metrics-tls") pod "dns-default-q62qx" (UID: "39993c75-c30d-49df-8bc3-5e7250217350") : secret "dns-default-metrics-tls" not found
Apr 22 18:47:25.473810 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:25.473742 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a601b18-e341-409a-8702-534bf559976c-cert\") pod \"ingress-canary-sd2xm\" (UID: \"2a601b18-e341-409a-8702-534bf559976c\") " pod="openshift-ingress-canary/ingress-canary-sd2xm"
Apr 22 18:47:25.473990 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:25.473892 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:47:25.473990 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:25.473958 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a601b18-e341-409a-8702-534bf559976c-cert podName:2a601b18-e341-409a-8702-534bf559976c nodeName:}" failed. No retries permitted until 2026-04-22 18:47:33.47394081 +0000 UTC m=+48.714056889 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a601b18-e341-409a-8702-534bf559976c-cert") pod "ingress-canary-sd2xm" (UID: "2a601b18-e341-409a-8702-534bf559976c") : secret "canary-serving-cert" not found
Apr 22 18:47:26.178899 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:26.178867 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0c9de063-c082-4665-b7f0-97a86598f6a1-original-pull-secret\") pod \"global-pull-secret-syncer-d4kbs\" (UID: \"0c9de063-c082-4665-b7f0-97a86598f6a1\") " pod="kube-system/global-pull-secret-syncer-d4kbs"
Apr 22 18:47:26.182080 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:26.182053 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0c9de063-c082-4665-b7f0-97a86598f6a1-original-pull-secret\") pod \"global-pull-secret-syncer-d4kbs\" (UID: \"0c9de063-c082-4665-b7f0-97a86598f6a1\") " pod="kube-system/global-pull-secret-syncer-d4kbs"
Apr 22 18:47:26.274493 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:26.274465 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d4kbs"
Apr 22 18:47:26.444145 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:26.444078 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-d4kbs"]
Apr 22 18:47:26.447649 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:47:26.447611 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c9de063_c082_4665_b7f0_97a86598f6a1.slice/crio-f086b04a53709d1066405aad6b013399af1076c7bd7f42837b4dc27effa3e99d WatchSource:0}: Error finding container f086b04a53709d1066405aad6b013399af1076c7bd7f42837b4dc27effa3e99d: Status 404 returned error can't find the container with id f086b04a53709d1066405aad6b013399af1076c7bd7f42837b4dc27effa3e99d
Apr 22 18:47:26.565001 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:26.564968 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-d4kbs" event={"ID":"0c9de063-c082-4665-b7f0-97a86598f6a1","Type":"ContainerStarted","Data":"f086b04a53709d1066405aad6b013399af1076c7bd7f42837b4dc27effa3e99d"}
Apr 22 18:47:31.576945 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:31.576871 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-d4kbs" event={"ID":"0c9de063-c082-4665-b7f0-97a86598f6a1","Type":"ContainerStarted","Data":"5365477654c2606efe40f0e176b94f438b94fa09aff474f177dd881679e88670"}
Apr 22 18:47:31.590038 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:31.589992 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-d4kbs" podStartSLOduration=33.452101623 podStartE2EDuration="37.589977558s" podCreationTimestamp="2026-04-22 18:46:54 +0000 UTC" firstStartedPulling="2026-04-22 18:47:26.449912352 +0000 UTC m=+41.690028440" lastFinishedPulling="2026-04-22 18:47:30.587788294 +0000 UTC m=+45.827904375" observedRunningTime="2026-04-22 18:47:31.589396962 +0000 UTC m=+46.829513064" watchObservedRunningTime="2026-04-22 18:47:31.589977558 +0000 UTC m=+46.830093637"
Apr 22 18:47:33.431506 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:33.431475 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39993c75-c30d-49df-8bc3-5e7250217350-metrics-tls\") pod \"dns-default-q62qx\" (UID: \"39993c75-c30d-49df-8bc3-5e7250217350\") " pod="openshift-dns/dns-default-q62qx"
Apr 22 18:47:33.431954 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:33.431610 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:47:33.431954 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:33.431692 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39993c75-c30d-49df-8bc3-5e7250217350-metrics-tls podName:39993c75-c30d-49df-8bc3-5e7250217350 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:49.431673249 +0000 UTC m=+64.671789330 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/39993c75-c30d-49df-8bc3-5e7250217350-metrics-tls") pod "dns-default-q62qx" (UID: "39993c75-c30d-49df-8bc3-5e7250217350") : secret "dns-default-metrics-tls" not found Apr 22 18:47:33.532263 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:33.532234 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a601b18-e341-409a-8702-534bf559976c-cert\") pod \"ingress-canary-sd2xm\" (UID: \"2a601b18-e341-409a-8702-534bf559976c\") " pod="openshift-ingress-canary/ingress-canary-sd2xm" Apr 22 18:47:33.532383 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:33.532368 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:33.532434 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:33.532424 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a601b18-e341-409a-8702-534bf559976c-cert podName:2a601b18-e341-409a-8702-534bf559976c nodeName:}" failed. No retries permitted until 2026-04-22 18:47:49.532407898 +0000 UTC m=+64.772523994 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a601b18-e341-409a-8702-534bf559976c-cert") pod "ingress-canary-sd2xm" (UID: "2a601b18-e341-409a-8702-534bf559976c") : secret "canary-serving-cert" not found Apr 22 18:47:43.768062 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:43.768030 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6df46c6847-8dpm9"] Apr 22 18:47:43.772288 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:43.772269 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858d898d6f-fcp6p"] Apr 22 18:47:43.772439 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:43.772423 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6df46c6847-8dpm9" Apr 22 18:47:43.774638 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:43.774621 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 22 18:47:43.774961 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:43.774943 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 22 18:47:43.775039 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:43.774948 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 22 18:47:43.775039 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:43.775009 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 22 18:47:43.775412 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:43.775394 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858d898d6f-fcp6p" Apr 22 18:47:43.777804 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:43.777783 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 22 18:47:43.777907 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:43.777807 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 22 18:47:43.778043 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:43.778027 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 22 18:47:43.778192 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:43.778050 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 22 18:47:43.778192 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:43.778101 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6df46c6847-8dpm9"] Apr 22 18:47:43.783676 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:43.783655 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858d898d6f-fcp6p"] Apr 22 18:47:43.906678 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:43.906649 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg22p\" (UniqueName: \"kubernetes.io/projected/9e463904-9c12-4bd4-b135-ee7511969cf1-kube-api-access-sg22p\") pod \"klusterlet-addon-workmgr-6df46c6847-8dpm9\" (UID: \"9e463904-9c12-4bd4-b135-ee7511969cf1\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6df46c6847-8dpm9" Apr 22 18:47:43.906678 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:43.906678 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/a6943a10-15d5-4283-9ddd-54b7ee9bf3e4-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-858d898d6f-fcp6p\" (UID: \"a6943a10-15d5-4283-9ddd-54b7ee9bf3e4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858d898d6f-fcp6p" Apr 22 18:47:43.906839 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:43.906720 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/a6943a10-15d5-4283-9ddd-54b7ee9bf3e4-ca\") pod \"cluster-proxy-proxy-agent-858d898d6f-fcp6p\" (UID: \"a6943a10-15d5-4283-9ddd-54b7ee9bf3e4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858d898d6f-fcp6p" Apr 22 18:47:43.906839 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:43.906754 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9e463904-9c12-4bd4-b135-ee7511969cf1-tmp\") pod \"klusterlet-addon-workmgr-6df46c6847-8dpm9\" (UID: \"9e463904-9c12-4bd4-b135-ee7511969cf1\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6df46c6847-8dpm9" Apr 22 18:47:43.906839 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:43.906771 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/9e463904-9c12-4bd4-b135-ee7511969cf1-klusterlet-config\") pod \"klusterlet-addon-workmgr-6df46c6847-8dpm9\" (UID: \"9e463904-9c12-4bd4-b135-ee7511969cf1\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6df46c6847-8dpm9" Apr 22 18:47:43.906839 ip-10-0-133-84 
kubenswrapper[2573]: I0422 18:47:43.906787 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/a6943a10-15d5-4283-9ddd-54b7ee9bf3e4-hub\") pod \"cluster-proxy-proxy-agent-858d898d6f-fcp6p\" (UID: \"a6943a10-15d5-4283-9ddd-54b7ee9bf3e4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858d898d6f-fcp6p" Apr 22 18:47:43.906981 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:43.906875 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a6943a10-15d5-4283-9ddd-54b7ee9bf3e4-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-858d898d6f-fcp6p\" (UID: \"a6943a10-15d5-4283-9ddd-54b7ee9bf3e4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858d898d6f-fcp6p" Apr 22 18:47:43.906981 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:43.906900 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/a6943a10-15d5-4283-9ddd-54b7ee9bf3e4-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-858d898d6f-fcp6p\" (UID: \"a6943a10-15d5-4283-9ddd-54b7ee9bf3e4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858d898d6f-fcp6p" Apr 22 18:47:43.906981 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:43.906918 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpchs\" (UniqueName: \"kubernetes.io/projected/a6943a10-15d5-4283-9ddd-54b7ee9bf3e4-kube-api-access-mpchs\") pod \"cluster-proxy-proxy-agent-858d898d6f-fcp6p\" (UID: \"a6943a10-15d5-4283-9ddd-54b7ee9bf3e4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858d898d6f-fcp6p" Apr 22 18:47:44.007126 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:44.007102 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9e463904-9c12-4bd4-b135-ee7511969cf1-tmp\") pod \"klusterlet-addon-workmgr-6df46c6847-8dpm9\" (UID: \"9e463904-9c12-4bd4-b135-ee7511969cf1\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6df46c6847-8dpm9" Apr 22 18:47:44.007244 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:44.007128 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/9e463904-9c12-4bd4-b135-ee7511969cf1-klusterlet-config\") pod \"klusterlet-addon-workmgr-6df46c6847-8dpm9\" (UID: \"9e463904-9c12-4bd4-b135-ee7511969cf1\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6df46c6847-8dpm9" Apr 22 18:47:44.007244 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:44.007143 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/a6943a10-15d5-4283-9ddd-54b7ee9bf3e4-hub\") pod \"cluster-proxy-proxy-agent-858d898d6f-fcp6p\" (UID: \"a6943a10-15d5-4283-9ddd-54b7ee9bf3e4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858d898d6f-fcp6p" Apr 22 18:47:44.007244 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:44.007209 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a6943a10-15d5-4283-9ddd-54b7ee9bf3e4-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-858d898d6f-fcp6p\" (UID: \"a6943a10-15d5-4283-9ddd-54b7ee9bf3e4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858d898d6f-fcp6p" Apr 22 18:47:44.007244 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:44.007232 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/a6943a10-15d5-4283-9ddd-54b7ee9bf3e4-service-proxy-server-cert\") pod 
\"cluster-proxy-proxy-agent-858d898d6f-fcp6p\" (UID: \"a6943a10-15d5-4283-9ddd-54b7ee9bf3e4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858d898d6f-fcp6p" Apr 22 18:47:44.007426 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:44.007257 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mpchs\" (UniqueName: \"kubernetes.io/projected/a6943a10-15d5-4283-9ddd-54b7ee9bf3e4-kube-api-access-mpchs\") pod \"cluster-proxy-proxy-agent-858d898d6f-fcp6p\" (UID: \"a6943a10-15d5-4283-9ddd-54b7ee9bf3e4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858d898d6f-fcp6p" Apr 22 18:47:44.007509 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:44.007484 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sg22p\" (UniqueName: \"kubernetes.io/projected/9e463904-9c12-4bd4-b135-ee7511969cf1-kube-api-access-sg22p\") pod \"klusterlet-addon-workmgr-6df46c6847-8dpm9\" (UID: \"9e463904-9c12-4bd4-b135-ee7511969cf1\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6df46c6847-8dpm9" Apr 22 18:47:44.007720 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:44.007695 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/a6943a10-15d5-4283-9ddd-54b7ee9bf3e4-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-858d898d6f-fcp6p\" (UID: \"a6943a10-15d5-4283-9ddd-54b7ee9bf3e4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858d898d6f-fcp6p" Apr 22 18:47:44.007823 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:44.007517 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9e463904-9c12-4bd4-b135-ee7511969cf1-tmp\") pod \"klusterlet-addon-workmgr-6df46c6847-8dpm9\" (UID: \"9e463904-9c12-4bd4-b135-ee7511969cf1\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6df46c6847-8dpm9" Apr 22 18:47:44.007823 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:44.007759 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/a6943a10-15d5-4283-9ddd-54b7ee9bf3e4-ca\") pod \"cluster-proxy-proxy-agent-858d898d6f-fcp6p\" (UID: \"a6943a10-15d5-4283-9ddd-54b7ee9bf3e4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858d898d6f-fcp6p" Apr 22 18:47:44.008715 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:44.008690 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/a6943a10-15d5-4283-9ddd-54b7ee9bf3e4-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-858d898d6f-fcp6p\" (UID: \"a6943a10-15d5-4283-9ddd-54b7ee9bf3e4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858d898d6f-fcp6p" Apr 22 18:47:44.010049 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:44.010024 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/9e463904-9c12-4bd4-b135-ee7511969cf1-klusterlet-config\") pod \"klusterlet-addon-workmgr-6df46c6847-8dpm9\" (UID: \"9e463904-9c12-4bd4-b135-ee7511969cf1\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6df46c6847-8dpm9" Apr 22 18:47:44.010125 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:44.010049 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/a6943a10-15d5-4283-9ddd-54b7ee9bf3e4-hub\") pod \"cluster-proxy-proxy-agent-858d898d6f-fcp6p\" (UID: \"a6943a10-15d5-4283-9ddd-54b7ee9bf3e4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858d898d6f-fcp6p" Apr 22 18:47:44.010188 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:44.010122 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a6943a10-15d5-4283-9ddd-54b7ee9bf3e4-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-858d898d6f-fcp6p\" (UID: \"a6943a10-15d5-4283-9ddd-54b7ee9bf3e4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858d898d6f-fcp6p" Apr 22 18:47:44.010332 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:44.010316 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/a6943a10-15d5-4283-9ddd-54b7ee9bf3e4-ca\") pod \"cluster-proxy-proxy-agent-858d898d6f-fcp6p\" (UID: \"a6943a10-15d5-4283-9ddd-54b7ee9bf3e4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858d898d6f-fcp6p" Apr 22 18:47:44.010404 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:44.010386 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/a6943a10-15d5-4283-9ddd-54b7ee9bf3e4-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-858d898d6f-fcp6p\" (UID: \"a6943a10-15d5-4283-9ddd-54b7ee9bf3e4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858d898d6f-fcp6p" Apr 22 18:47:44.015078 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:44.015059 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpchs\" (UniqueName: \"kubernetes.io/projected/a6943a10-15d5-4283-9ddd-54b7ee9bf3e4-kube-api-access-mpchs\") pod \"cluster-proxy-proxy-agent-858d898d6f-fcp6p\" (UID: \"a6943a10-15d5-4283-9ddd-54b7ee9bf3e4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858d898d6f-fcp6p" Apr 22 18:47:44.015153 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:44.015075 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg22p\" (UniqueName: \"kubernetes.io/projected/9e463904-9c12-4bd4-b135-ee7511969cf1-kube-api-access-sg22p\") pod \"klusterlet-addon-workmgr-6df46c6847-8dpm9\" 
(UID: \"9e463904-9c12-4bd4-b135-ee7511969cf1\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6df46c6847-8dpm9" Apr 22 18:47:44.084280 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:44.084225 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6df46c6847-8dpm9" Apr 22 18:47:44.097942 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:44.097921 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858d898d6f-fcp6p" Apr 22 18:47:44.230064 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:44.230033 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858d898d6f-fcp6p"] Apr 22 18:47:44.233550 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:47:44.233526 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6943a10_15d5_4283_9ddd_54b7ee9bf3e4.slice/crio-f159664e28c32a16171f9a2e0b77f9ae28ce07c4cbc16f09c8b5f0cfc51974e2 WatchSource:0}: Error finding container f159664e28c32a16171f9a2e0b77f9ae28ce07c4cbc16f09c8b5f0cfc51974e2: Status 404 returned error can't find the container with id f159664e28c32a16171f9a2e0b77f9ae28ce07c4cbc16f09c8b5f0cfc51974e2 Apr 22 18:47:44.244037 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:44.244013 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6df46c6847-8dpm9"] Apr 22 18:47:44.246866 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:47:44.246842 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e463904_9c12_4bd4_b135_ee7511969cf1.slice/crio-440a06ae170361763ab3882ddf487284b6e5a4354e14da9b0d883f9592ad66c6 WatchSource:0}: Error finding container 
440a06ae170361763ab3882ddf487284b6e5a4354e14da9b0d883f9592ad66c6: Status 404 returned error can't find the container with id 440a06ae170361763ab3882ddf487284b6e5a4354e14da9b0d883f9592ad66c6 Apr 22 18:47:44.603187 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:44.603127 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858d898d6f-fcp6p" event={"ID":"a6943a10-15d5-4283-9ddd-54b7ee9bf3e4","Type":"ContainerStarted","Data":"f159664e28c32a16171f9a2e0b77f9ae28ce07c4cbc16f09c8b5f0cfc51974e2"} Apr 22 18:47:44.604066 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:44.604043 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6df46c6847-8dpm9" event={"ID":"9e463904-9c12-4bd4-b135-ee7511969cf1","Type":"ContainerStarted","Data":"440a06ae170361763ab3882ddf487284b6e5a4354e14da9b0d883f9592ad66c6"} Apr 22 18:47:45.657082 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:45.657051 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cr6wp" Apr 22 18:47:49.448461 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:49.448423 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39993c75-c30d-49df-8bc3-5e7250217350-metrics-tls\") pod \"dns-default-q62qx\" (UID: \"39993c75-c30d-49df-8bc3-5e7250217350\") " pod="openshift-dns/dns-default-q62qx" Apr 22 18:47:49.449043 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:49.448583 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:49.449043 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:49.448676 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39993c75-c30d-49df-8bc3-5e7250217350-metrics-tls podName:39993c75-c30d-49df-8bc3-5e7250217350 nodeName:}" 
failed. No retries permitted until 2026-04-22 18:48:21.448653819 +0000 UTC m=+96.688769902 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/39993c75-c30d-49df-8bc3-5e7250217350-metrics-tls") pod "dns-default-q62qx" (UID: "39993c75-c30d-49df-8bc3-5e7250217350") : secret "dns-default-metrics-tls" not found Apr 22 18:47:49.549160 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:49.549132 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a601b18-e341-409a-8702-534bf559976c-cert\") pod \"ingress-canary-sd2xm\" (UID: \"2a601b18-e341-409a-8702-534bf559976c\") " pod="openshift-ingress-canary/ingress-canary-sd2xm" Apr 22 18:47:49.549324 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:49.549304 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:49.549395 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:49.549383 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a601b18-e341-409a-8702-534bf559976c-cert podName:2a601b18-e341-409a-8702-534bf559976c nodeName:}" failed. No retries permitted until 2026-04-22 18:48:21.549364026 +0000 UTC m=+96.789480109 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a601b18-e341-409a-8702-534bf559976c-cert") pod "ingress-canary-sd2xm" (UID: "2a601b18-e341-409a-8702-534bf559976c") : secret "canary-serving-cert" not found Apr 22 18:47:49.616187 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:49.616120 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858d898d6f-fcp6p" event={"ID":"a6943a10-15d5-4283-9ddd-54b7ee9bf3e4","Type":"ContainerStarted","Data":"64eaab5a1ea3cba5e4039070e81368234d1d9c442a991e47dd6cfa537057ff24"} Apr 22 18:47:49.617536 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:49.617512 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6df46c6847-8dpm9" event={"ID":"9e463904-9c12-4bd4-b135-ee7511969cf1","Type":"ContainerStarted","Data":"f31f53bcce8213efc06b7e6d93bb1ead75b5ddf3e6289e9af89a7aea2b652ec6"} Apr 22 18:47:50.051925 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:50.051892 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-88lkc\" (UniqueName: \"kubernetes.io/projected/54689db5-4b53-4548-b0fc-1a5da6d4dbcc-kube-api-access-88lkc\") pod \"network-check-target-cxglm\" (UID: \"54689db5-4b53-4548-b0fc-1a5da6d4dbcc\") " pod="openshift-network-diagnostics/network-check-target-cxglm" Apr 22 18:47:50.054711 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:50.054694 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 18:47:50.064784 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:50.064767 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 18:47:50.075201 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:50.075160 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-88lkc\" (UniqueName: \"kubernetes.io/projected/54689db5-4b53-4548-b0fc-1a5da6d4dbcc-kube-api-access-88lkc\") pod \"network-check-target-cxglm\" (UID: \"54689db5-4b53-4548-b0fc-1a5da6d4dbcc\") " pod="openshift-network-diagnostics/network-check-target-cxglm" Apr 22 18:47:50.152387 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:50.152357 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c87517bd-8a13-4cb2-bf88-0b3d8c58b67c-metrics-certs\") pod \"network-metrics-daemon-mjbsn\" (UID: \"c87517bd-8a13-4cb2-bf88-0b3d8c58b67c\") " pod="openshift-multus/network-metrics-daemon-mjbsn" Apr 22 18:47:50.154906 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:50.154890 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 18:47:50.162945 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:50.162927 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 18:47:50.163019 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:47:50.162977 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c87517bd-8a13-4cb2-bf88-0b3d8c58b67c-metrics-certs podName:c87517bd-8a13-4cb2-bf88-0b3d8c58b67c nodeName:}" failed. No retries permitted until 2026-04-22 18:48:54.162962156 +0000 UTC m=+129.403078235 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c87517bd-8a13-4cb2-bf88-0b3d8c58b67c-metrics-certs") pod "network-metrics-daemon-mjbsn" (UID: "c87517bd-8a13-4cb2-bf88-0b3d8c58b67c") : secret "metrics-daemon-secret" not found
Apr 22 18:47:50.271732 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:50.271709 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-5ggfs\""
Apr 22 18:47:50.279526 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:50.279509 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxglm"
Apr 22 18:47:50.392358 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:50.392330 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-cxglm"]
Apr 22 18:47:50.405558 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:47:50.405524 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54689db5_4b53_4548_b0fc_1a5da6d4dbcc.slice/crio-efd4ce189684295bca87623a2c6c424b59b5401127c86aab7e4482790957e81e WatchSource:0}: Error finding container efd4ce189684295bca87623a2c6c424b59b5401127c86aab7e4482790957e81e: Status 404 returned error can't find the container with id efd4ce189684295bca87623a2c6c424b59b5401127c86aab7e4482790957e81e
Apr 22 18:47:50.620438 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:50.620412 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-cxglm" event={"ID":"54689db5-4b53-4548-b0fc-1a5da6d4dbcc","Type":"ContainerStarted","Data":"efd4ce189684295bca87623a2c6c424b59b5401127c86aab7e4482790957e81e"}
Apr 22 18:47:50.620804 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:50.620610 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6df46c6847-8dpm9"
Apr 22 18:47:50.622151 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:50.622132 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6df46c6847-8dpm9"
Apr 22 18:47:50.634915 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:50.634874 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6df46c6847-8dpm9" podStartSLOduration=2.382980972 podStartE2EDuration="7.634862804s" podCreationTimestamp="2026-04-22 18:47:43 +0000 UTC" firstStartedPulling="2026-04-22 18:47:44.24839793 +0000 UTC m=+59.488514010" lastFinishedPulling="2026-04-22 18:47:49.500279759 +0000 UTC m=+64.740395842" observedRunningTime="2026-04-22 18:47:50.633694296 +0000 UTC m=+65.873810398" watchObservedRunningTime="2026-04-22 18:47:50.634862804 +0000 UTC m=+65.874978906"
Apr 22 18:47:54.631408 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:54.631370 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-cxglm" event={"ID":"54689db5-4b53-4548-b0fc-1a5da6d4dbcc","Type":"ContainerStarted","Data":"b77ddbe1a8d29c579f6401544bf9ec5835dc552f9a1fcf9e79cd0f4005815a3c"}
Apr 22 18:47:54.631835 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:54.631441 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-cxglm"
Apr 22 18:47:54.633332 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:54.633301 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858d898d6f-fcp6p" event={"ID":"a6943a10-15d5-4283-9ddd-54b7ee9bf3e4","Type":"ContainerStarted","Data":"a530c1761287102a3e7401705b1ce267194dace01c8576ce6027544e98b121e9"}
Apr 22 18:47:54.633332 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:54.633333 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858d898d6f-fcp6p" event={"ID":"a6943a10-15d5-4283-9ddd-54b7ee9bf3e4","Type":"ContainerStarted","Data":"905579dbdeaaed182cc4087def6e248057ab08305bc1e9bd12f84f02476c9bb1"}
Apr 22 18:47:54.646043 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:54.645994 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-cxglm" podStartSLOduration=66.301544879 podStartE2EDuration="1m9.645982969s" podCreationTimestamp="2026-04-22 18:46:45 +0000 UTC" firstStartedPulling="2026-04-22 18:47:50.407429077 +0000 UTC m=+65.647545158" lastFinishedPulling="2026-04-22 18:47:53.751867169 +0000 UTC m=+68.991983248" observedRunningTime="2026-04-22 18:47:54.645002253 +0000 UTC m=+69.885118365" watchObservedRunningTime="2026-04-22 18:47:54.645982969 +0000 UTC m=+69.886099070"
Apr 22 18:47:54.661003 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:54.660946 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858d898d6f-fcp6p" podStartSLOduration=2.150548767 podStartE2EDuration="11.660935659s" podCreationTimestamp="2026-04-22 18:47:43 +0000 UTC" firstStartedPulling="2026-04-22 18:47:44.234987432 +0000 UTC m=+59.475103515" lastFinishedPulling="2026-04-22 18:47:53.745374327 +0000 UTC m=+68.985490407" observedRunningTime="2026-04-22 18:47:54.659729201 +0000 UTC m=+69.899845325" watchObservedRunningTime="2026-04-22 18:47:54.660935659 +0000 UTC m=+69.901051782"
Apr 22 18:48:21.470362 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:48:21.470322 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39993c75-c30d-49df-8bc3-5e7250217350-metrics-tls\") pod \"dns-default-q62qx\" (UID: \"39993c75-c30d-49df-8bc3-5e7250217350\") " pod="openshift-dns/dns-default-q62qx"
Apr 22 18:48:21.470856 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:48:21.470457 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:48:21.470856 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:48:21.470532 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39993c75-c30d-49df-8bc3-5e7250217350-metrics-tls podName:39993c75-c30d-49df-8bc3-5e7250217350 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:25.470511565 +0000 UTC m=+160.710627645 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/39993c75-c30d-49df-8bc3-5e7250217350-metrics-tls") pod "dns-default-q62qx" (UID: "39993c75-c30d-49df-8bc3-5e7250217350") : secret "dns-default-metrics-tls" not found
Apr 22 18:48:21.570932 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:48:21.570905 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a601b18-e341-409a-8702-534bf559976c-cert\") pod \"ingress-canary-sd2xm\" (UID: \"2a601b18-e341-409a-8702-534bf559976c\") " pod="openshift-ingress-canary/ingress-canary-sd2xm"
Apr 22 18:48:21.571077 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:48:21.571057 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:48:21.571141 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:48:21.571132 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a601b18-e341-409a-8702-534bf559976c-cert podName:2a601b18-e341-409a-8702-534bf559976c nodeName:}" failed. No retries permitted until 2026-04-22 18:49:25.571116406 +0000 UTC m=+160.811232491 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a601b18-e341-409a-8702-534bf559976c-cert") pod "ingress-canary-sd2xm" (UID: "2a601b18-e341-409a-8702-534bf559976c") : secret "canary-serving-cert" not found
Apr 22 18:48:25.638670 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:48:25.638642 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-cxglm"
Apr 22 18:48:54.192462 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:48:54.192421 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c87517bd-8a13-4cb2-bf88-0b3d8c58b67c-metrics-certs\") pod \"network-metrics-daemon-mjbsn\" (UID: \"c87517bd-8a13-4cb2-bf88-0b3d8c58b67c\") " pod="openshift-multus/network-metrics-daemon-mjbsn"
Apr 22 18:48:54.192941 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:48:54.192571 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 18:48:54.192941 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:48:54.192636 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c87517bd-8a13-4cb2-bf88-0b3d8c58b67c-metrics-certs podName:c87517bd-8a13-4cb2-bf88-0b3d8c58b67c nodeName:}" failed. No retries permitted until 2026-04-22 18:50:56.192619668 +0000 UTC m=+251.432735748 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c87517bd-8a13-4cb2-bf88-0b3d8c58b67c-metrics-certs") pod "network-metrics-daemon-mjbsn" (UID: "c87517bd-8a13-4cb2-bf88-0b3d8c58b67c") : secret "metrics-daemon-secret" not found
Apr 22 18:48:59.508821 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:48:59.508794 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-sklxm_c935a1fa-6e87-4c17-b437-53842e8bf6b5/dns-node-resolver/0.log"
Apr 22 18:49:00.109344 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:00.109315 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fsm87_613626ea-ce6e-4907-a058-fed79d83cc79/node-ca/0.log"
Apr 22 18:49:17.500280 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:17.500247 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-28bz4"]
Apr 22 18:49:17.503513 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:17.503493 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-28bz4"
Apr 22 18:49:17.505789 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:17.505763 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 22 18:49:17.507060 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:17.507039 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 22 18:49:17.507060 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:17.507058 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-zj8q7\""
Apr 22 18:49:17.507209 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:17.507043 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 22 18:49:17.507209 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:17.507039 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 22 18:49:17.514679 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:17.514657 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-28bz4"]
Apr 22 18:49:17.658313 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:17.658281 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4e9618b6-810d-461a-ac5f-ce2cacfcc950-crio-socket\") pod \"insights-runtime-extractor-28bz4\" (UID: \"4e9618b6-810d-461a-ac5f-ce2cacfcc950\") " pod="openshift-insights/insights-runtime-extractor-28bz4"
Apr 22 18:49:17.658471 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:17.658331 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4e9618b6-810d-461a-ac5f-ce2cacfcc950-data-volume\") pod \"insights-runtime-extractor-28bz4\" (UID: \"4e9618b6-810d-461a-ac5f-ce2cacfcc950\") " pod="openshift-insights/insights-runtime-extractor-28bz4"
Apr 22 18:49:17.658471 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:17.658367 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4e9618b6-810d-461a-ac5f-ce2cacfcc950-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-28bz4\" (UID: \"4e9618b6-810d-461a-ac5f-ce2cacfcc950\") " pod="openshift-insights/insights-runtime-extractor-28bz4"
Apr 22 18:49:17.658574 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:17.658464 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4e9618b6-810d-461a-ac5f-ce2cacfcc950-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-28bz4\" (UID: \"4e9618b6-810d-461a-ac5f-ce2cacfcc950\") " pod="openshift-insights/insights-runtime-extractor-28bz4"
Apr 22 18:49:17.658574 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:17.658512 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc47l\" (UniqueName: \"kubernetes.io/projected/4e9618b6-810d-461a-ac5f-ce2cacfcc950-kube-api-access-xc47l\") pod \"insights-runtime-extractor-28bz4\" (UID: \"4e9618b6-810d-461a-ac5f-ce2cacfcc950\") " pod="openshift-insights/insights-runtime-extractor-28bz4"
Apr 22 18:49:17.759628 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:17.759561 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xc47l\" (UniqueName: \"kubernetes.io/projected/4e9618b6-810d-461a-ac5f-ce2cacfcc950-kube-api-access-xc47l\") pod \"insights-runtime-extractor-28bz4\" (UID: \"4e9618b6-810d-461a-ac5f-ce2cacfcc950\") " pod="openshift-insights/insights-runtime-extractor-28bz4"
Apr 22 18:49:17.759628 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:17.759609 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4e9618b6-810d-461a-ac5f-ce2cacfcc950-crio-socket\") pod \"insights-runtime-extractor-28bz4\" (UID: \"4e9618b6-810d-461a-ac5f-ce2cacfcc950\") " pod="openshift-insights/insights-runtime-extractor-28bz4"
Apr 22 18:49:17.759804 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:17.759634 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4e9618b6-810d-461a-ac5f-ce2cacfcc950-data-volume\") pod \"insights-runtime-extractor-28bz4\" (UID: \"4e9618b6-810d-461a-ac5f-ce2cacfcc950\") " pod="openshift-insights/insights-runtime-extractor-28bz4"
Apr 22 18:49:17.759804 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:17.759661 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4e9618b6-810d-461a-ac5f-ce2cacfcc950-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-28bz4\" (UID: \"4e9618b6-810d-461a-ac5f-ce2cacfcc950\") " pod="openshift-insights/insights-runtime-extractor-28bz4"
Apr 22 18:49:17.759804 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:17.759751 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4e9618b6-810d-461a-ac5f-ce2cacfcc950-crio-socket\") pod \"insights-runtime-extractor-28bz4\" (UID: \"4e9618b6-810d-461a-ac5f-ce2cacfcc950\") " pod="openshift-insights/insights-runtime-extractor-28bz4"
Apr 22 18:49:17.759804 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:17.759750 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4e9618b6-810d-461a-ac5f-ce2cacfcc950-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-28bz4\" (UID: \"4e9618b6-810d-461a-ac5f-ce2cacfcc950\") " pod="openshift-insights/insights-runtime-extractor-28bz4"
Apr 22 18:49:17.760032 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:17.760012 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4e9618b6-810d-461a-ac5f-ce2cacfcc950-data-volume\") pod \"insights-runtime-extractor-28bz4\" (UID: \"4e9618b6-810d-461a-ac5f-ce2cacfcc950\") " pod="openshift-insights/insights-runtime-extractor-28bz4"
Apr 22 18:49:17.760304 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:17.760285 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4e9618b6-810d-461a-ac5f-ce2cacfcc950-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-28bz4\" (UID: \"4e9618b6-810d-461a-ac5f-ce2cacfcc950\") " pod="openshift-insights/insights-runtime-extractor-28bz4"
Apr 22 18:49:17.762040 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:17.762021 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4e9618b6-810d-461a-ac5f-ce2cacfcc950-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-28bz4\" (UID: \"4e9618b6-810d-461a-ac5f-ce2cacfcc950\") " pod="openshift-insights/insights-runtime-extractor-28bz4"
Apr 22 18:49:17.767573 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:17.767556 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc47l\" (UniqueName: \"kubernetes.io/projected/4e9618b6-810d-461a-ac5f-ce2cacfcc950-kube-api-access-xc47l\") pod \"insights-runtime-extractor-28bz4\" (UID: \"4e9618b6-810d-461a-ac5f-ce2cacfcc950\") " pod="openshift-insights/insights-runtime-extractor-28bz4"
Apr 22 18:49:17.812555 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:17.812537 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-28bz4"
Apr 22 18:49:17.925984 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:17.925951 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-28bz4"]
Apr 22 18:49:17.928990 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:49:17.928965 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e9618b6_810d_461a_ac5f_ce2cacfcc950.slice/crio-a17ae755d6f116aa00014c1651f4bb4737cd0dcc8dc4158d04cda9965c5b720e WatchSource:0}: Error finding container a17ae755d6f116aa00014c1651f4bb4737cd0dcc8dc4158d04cda9965c5b720e: Status 404 returned error can't find the container with id a17ae755d6f116aa00014c1651f4bb4737cd0dcc8dc4158d04cda9965c5b720e
Apr 22 18:49:18.828280 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:18.828188 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-28bz4" event={"ID":"4e9618b6-810d-461a-ac5f-ce2cacfcc950","Type":"ContainerStarted","Data":"ad2405f0b86b8e251b2d9c2b209fb3241221a295627280bdadb97fcd0bb7bbad"}
Apr 22 18:49:18.828280 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:18.828230 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-28bz4" event={"ID":"4e9618b6-810d-461a-ac5f-ce2cacfcc950","Type":"ContainerStarted","Data":"f5f5c87529b159ea7549991958d815737b93d3f866f12c2da9d600c654c60c8c"}
Apr 22 18:49:18.828280 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:18.828240 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-28bz4" event={"ID":"4e9618b6-810d-461a-ac5f-ce2cacfcc950","Type":"ContainerStarted","Data":"a17ae755d6f116aa00014c1651f4bb4737cd0dcc8dc4158d04cda9965c5b720e"}
Apr 22 18:49:20.654357 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:49:20.654323 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-q62qx" podUID="39993c75-c30d-49df-8bc3-5e7250217350"
Apr 22 18:49:20.669474 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:49:20.669443 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-sd2xm" podUID="2a601b18-e341-409a-8702-534bf559976c"
Apr 22 18:49:20.834829 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:20.834795 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-28bz4" event={"ID":"4e9618b6-810d-461a-ac5f-ce2cacfcc950","Type":"ContainerStarted","Data":"45f927d4691ca736ce029adfea0893e5bee4c03e570e49d7f01a5880259819b1"}
Apr 22 18:49:20.834991 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:20.834868 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sd2xm"
Apr 22 18:49:20.835181 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:20.835151 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-q62qx"
Apr 22 18:49:20.849940 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:20.849893 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-28bz4" podStartSLOduration=1.782628421 podStartE2EDuration="3.84988182s" podCreationTimestamp="2026-04-22 18:49:17 +0000 UTC" firstStartedPulling="2026-04-22 18:49:17.978712896 +0000 UTC m=+153.218828976" lastFinishedPulling="2026-04-22 18:49:20.045966283 +0000 UTC m=+155.286082375" observedRunningTime="2026-04-22 18:49:20.849207212 +0000 UTC m=+156.089323315" watchObservedRunningTime="2026-04-22 18:49:20.84988182 +0000 UTC m=+156.089997921"
Apr 22 18:49:22.378932 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:49:22.378887 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-mjbsn" podUID="c87517bd-8a13-4cb2-bf88-0b3d8c58b67c"
Apr 22 18:49:24.099718 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:24.099680 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858d898d6f-fcp6p" podUID="a6943a10-15d5-4283-9ddd-54b7ee9bf3e4" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 22 18:49:25.512717 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:25.512683 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39993c75-c30d-49df-8bc3-5e7250217350-metrics-tls\") pod \"dns-default-q62qx\" (UID: \"39993c75-c30d-49df-8bc3-5e7250217350\") " pod="openshift-dns/dns-default-q62qx"
Apr 22 18:49:25.515114 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:25.515092 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39993c75-c30d-49df-8bc3-5e7250217350-metrics-tls\") pod \"dns-default-q62qx\" (UID: \"39993c75-c30d-49df-8bc3-5e7250217350\") " pod="openshift-dns/dns-default-q62qx"
Apr 22 18:49:25.613383 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:25.613351 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a601b18-e341-409a-8702-534bf559976c-cert\") pod \"ingress-canary-sd2xm\" (UID: \"2a601b18-e341-409a-8702-534bf559976c\") " pod="openshift-ingress-canary/ingress-canary-sd2xm"
Apr 22 18:49:25.615714 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:25.615693 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a601b18-e341-409a-8702-534bf559976c-cert\") pod \"ingress-canary-sd2xm\" (UID: \"2a601b18-e341-409a-8702-534bf559976c\") " pod="openshift-ingress-canary/ingress-canary-sd2xm"
Apr 22 18:49:25.637631 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:25.637607 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zg5rc\""
Apr 22 18:49:25.638597 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:25.638583 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rxjsb\""
Apr 22 18:49:25.645344 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:25.645330 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sd2xm"
Apr 22 18:49:25.645431 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:25.645418 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-q62qx"
Apr 22 18:49:25.773150 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:25.773063 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-q62qx"]
Apr 22 18:49:25.776199 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:49:25.776144 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39993c75_c30d_49df_8bc3_5e7250217350.slice/crio-7ed0f33e47a5f36a203a55bd9ed24b1bb5d0f26a12eb3c33e5a0f988d81f5647 WatchSource:0}: Error finding container 7ed0f33e47a5f36a203a55bd9ed24b1bb5d0f26a12eb3c33e5a0f988d81f5647: Status 404 returned error can't find the container with id 7ed0f33e47a5f36a203a55bd9ed24b1bb5d0f26a12eb3c33e5a0f988d81f5647
Apr 22 18:49:25.787263 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:25.787236 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-sd2xm"]
Apr 22 18:49:25.790134 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:49:25.790111 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a601b18_e341_409a_8702_534bf559976c.slice/crio-30ce22e12abe803053fb041c8a5a8e6ef080fdeb13255ec7839e7b90947c891a WatchSource:0}: Error finding container 30ce22e12abe803053fb041c8a5a8e6ef080fdeb13255ec7839e7b90947c891a: Status 404 returned error can't find the container with id 30ce22e12abe803053fb041c8a5a8e6ef080fdeb13255ec7839e7b90947c891a
Apr 22 18:49:25.850580 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:25.850539 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-q62qx" event={"ID":"39993c75-c30d-49df-8bc3-5e7250217350","Type":"ContainerStarted","Data":"7ed0f33e47a5f36a203a55bd9ed24b1bb5d0f26a12eb3c33e5a0f988d81f5647"}
Apr 22 18:49:25.851430 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:25.851405 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-sd2xm" event={"ID":"2a601b18-e341-409a-8702-534bf559976c","Type":"ContainerStarted","Data":"30ce22e12abe803053fb041c8a5a8e6ef080fdeb13255ec7839e7b90947c891a"}
Apr 22 18:49:26.661269 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:26.661232 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-tftld"]
Apr 22 18:49:26.664523 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:26.664504 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-tftld"
Apr 22 18:49:26.667159 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:26.667134 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 22 18:49:26.667159 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:26.667153 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-7fpb8\""
Apr 22 18:49:26.673698 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:26.672909 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-tftld"]
Apr 22 18:49:26.821900 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:26.821843 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/4916102b-7785-4334-9fcc-bd481df20b31-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-tftld\" (UID: \"4916102b-7785-4334-9fcc-bd481df20b31\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-tftld"
Apr 22 18:49:26.922794 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:26.922706 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/4916102b-7785-4334-9fcc-bd481df20b31-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-tftld\" (UID: \"4916102b-7785-4334-9fcc-bd481df20b31\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-tftld"
Apr 22 18:49:26.922959 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:49:26.922844 2573 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found
Apr 22 18:49:26.922959 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:49:26.922923 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4916102b-7785-4334-9fcc-bd481df20b31-tls-certificates podName:4916102b-7785-4334-9fcc-bd481df20b31 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:27.422898485 +0000 UTC m=+162.663014568 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/4916102b-7785-4334-9fcc-bd481df20b31-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-tftld" (UID: "4916102b-7785-4334-9fcc-bd481df20b31") : secret "prometheus-operator-admission-webhook-tls" not found
Apr 22 18:49:27.427440 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:27.427393 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/4916102b-7785-4334-9fcc-bd481df20b31-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-tftld\" (UID: \"4916102b-7785-4334-9fcc-bd481df20b31\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-tftld"
Apr 22 18:49:27.430429 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:27.430395 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/4916102b-7785-4334-9fcc-bd481df20b31-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-tftld\" (UID: \"4916102b-7785-4334-9fcc-bd481df20b31\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-tftld"
Apr 22 18:49:27.576586 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:27.576556 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-tftld"
Apr 22 18:49:27.718740 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:27.718658 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-tftld"]
Apr 22 18:49:27.858997 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:27.858958 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-q62qx" event={"ID":"39993c75-c30d-49df-8bc3-5e7250217350","Type":"ContainerStarted","Data":"bb838b3716214179a7ed1ec1bc95852eeddaf004c46b5a8c32dd52de3a4bf4a5"}
Apr 22 18:49:27.858997 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:47:27.859000 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-q62qx" event={"ID":"39993c75-c30d-49df-8bc3-5e7250217350","Type":"ContainerStarted","Data":"14ee74b87d9e24d053547ffb93011b786eef55d4520c8cf21f391d6141352641"}
Apr 22 18:49:27.859358 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:27.859189 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-q62qx"
Apr 22 18:49:27.860215 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:27.860192 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-sd2xm" event={"ID":"2a601b18-e341-409a-8702-534bf559976c","Type":"ContainerStarted","Data":"0306e7753c11865113b5ec9fe959ded6b20add19642b96ded59dbc2dac3d4f4f"}
Apr 22 18:49:27.861091 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:27.861063 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-tftld" event={"ID":"4916102b-7785-4334-9fcc-bd481df20b31","Type":"ContainerStarted","Data":"21625b078d4461bbdc4722bed92218ab11a05b0e6722593d33dec7f7696604c2"}
Apr 22 18:49:27.873732 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:27.873692 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-q62qx" podStartSLOduration=129.118550299 podStartE2EDuration="2m10.87368039s" podCreationTimestamp="2026-04-22 18:47:17 +0000 UTC" firstStartedPulling="2026-04-22 18:49:25.778286019 +0000 UTC m=+161.018402099" lastFinishedPulling="2026-04-22 18:49:27.533416107 +0000 UTC m=+162.773532190" observedRunningTime="2026-04-22 18:49:27.873026287 +0000 UTC m=+163.113142390" watchObservedRunningTime="2026-04-22 18:49:27.87368039 +0000 UTC m=+163.113796530"
Apr 22 18:49:27.886509 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:27.886475 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-sd2xm" podStartSLOduration=129.141262313 podStartE2EDuration="2m10.886465125s" podCreationTimestamp="2026-04-22 18:47:17 +0000 UTC" firstStartedPulling="2026-04-22 18:49:25.791931748 +0000 UTC m=+161.032047831" lastFinishedPulling="2026-04-22 18:49:27.537134563 +0000 UTC m=+162.777250643" observedRunningTime="2026-04-22 18:49:27.885899787 +0000 UTC m=+163.126015889" watchObservedRunningTime="2026-04-22 18:49:27.886465125 +0000 UTC m=+163.126581226"
Apr 22 18:49:28.865056 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:28.865024 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-tftld" event={"ID":"4916102b-7785-4334-9fcc-bd481df20b31","Type":"ContainerStarted","Data":"4c33bd0ff9f7e993884437826d1e7e13a0c79a92ddb179d67e95feb1161d235b"}
Apr 22 18:49:28.882063 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:28.882019 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-tftld" podStartSLOduration=1.840526825 podStartE2EDuration="2.882004436s" podCreationTimestamp="2026-04-22 18:49:26 +0000 UTC" firstStartedPulling="2026-04-22 18:49:27.7255929 +0000 UTC m=+162.965708987" lastFinishedPulling="2026-04-22 18:49:28.767070515 +0000 UTC m=+164.007186598" observedRunningTime="2026-04-22 18:49:28.880612321 +0000 UTC m=+164.120728422" watchObservedRunningTime="2026-04-22 18:49:28.882004436 +0000 UTC m=+164.122120537"
Apr 22 18:49:29.867885 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:29.867847 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-tftld"
Apr 22 18:49:29.872219 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:29.872200 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-tftld"
Apr 22 18:49:34.099114 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:34.099037 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858d898d6f-fcp6p" podUID="a6943a10-15d5-4283-9ddd-54b7ee9bf3e4" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 22 18:49:35.055298 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.055266 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-kjj54"]
Apr 22 18:49:35.058253 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.058233 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-kjj54"
Apr 22 18:49:35.058400 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.058375 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-jqtr9"]
Apr 22 18:49:35.060992 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.060963 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 18:49:35.060992 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.060983 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 18:49:35.061193 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.061037 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-sw8gt\""
Apr 22 18:49:35.061258 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.061246 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-jqtr9"
Apr 22 18:49:35.062308 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.062285 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 18:49:35.062409 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.062349 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 18:49:35.062409 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.062289 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 18:49:35.062527 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.062410 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 18:49:35.064028 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.064008 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 22 18:49:35.064028 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.064026 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-f77t5\""
Apr 22 18:49:35.064278 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.064008 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 22 18:49:35.064372 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.064357 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 22 18:49:35.073005 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.072986 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api"
pods=["openshift-monitoring/kube-state-metrics-69db897b98-jqtr9"] Apr 22 18:49:35.181027 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.180994 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/68fb2641-d900-4570-9106-dfa68f2a21a2-metrics-client-ca\") pod \"node-exporter-kjj54\" (UID: \"68fb2641-d900-4570-9106-dfa68f2a21a2\") " pod="openshift-monitoring/node-exporter-kjj54" Apr 22 18:49:35.181390 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.181033 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/68fb2641-d900-4570-9106-dfa68f2a21a2-node-exporter-wtmp\") pod \"node-exporter-kjj54\" (UID: \"68fb2641-d900-4570-9106-dfa68f2a21a2\") " pod="openshift-monitoring/node-exporter-kjj54" Apr 22 18:49:35.181390 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.181059 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/1ec1a089-f815-406b-a0ff-8ee4d88530dc-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-jqtr9\" (UID: \"1ec1a089-f815-406b-a0ff-8ee4d88530dc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqtr9" Apr 22 18:49:35.181390 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.181127 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npf4p\" (UniqueName: \"kubernetes.io/projected/1ec1a089-f815-406b-a0ff-8ee4d88530dc-kube-api-access-npf4p\") pod \"kube-state-metrics-69db897b98-jqtr9\" (UID: \"1ec1a089-f815-406b-a0ff-8ee4d88530dc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqtr9" Apr 22 18:49:35.181390 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.181157 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ec1a089-f815-406b-a0ff-8ee4d88530dc-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jqtr9\" (UID: \"1ec1a089-f815-406b-a0ff-8ee4d88530dc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqtr9" Apr 22 18:49:35.181390 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.181199 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9trv\" (UniqueName: \"kubernetes.io/projected/68fb2641-d900-4570-9106-dfa68f2a21a2-kube-api-access-s9trv\") pod \"node-exporter-kjj54\" (UID: \"68fb2641-d900-4570-9106-dfa68f2a21a2\") " pod="openshift-monitoring/node-exporter-kjj54" Apr 22 18:49:35.181390 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.181221 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/68fb2641-d900-4570-9106-dfa68f2a21a2-node-exporter-tls\") pod \"node-exporter-kjj54\" (UID: \"68fb2641-d900-4570-9106-dfa68f2a21a2\") " pod="openshift-monitoring/node-exporter-kjj54" Apr 22 18:49:35.181390 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.181270 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/68fb2641-d900-4570-9106-dfa68f2a21a2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-kjj54\" (UID: \"68fb2641-d900-4570-9106-dfa68f2a21a2\") " pod="openshift-monitoring/node-exporter-kjj54" Apr 22 18:49:35.181390 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.181304 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/1ec1a089-f815-406b-a0ff-8ee4d88530dc-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-jqtr9\" (UID: \"1ec1a089-f815-406b-a0ff-8ee4d88530dc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqtr9" Apr 22 18:49:35.181390 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.181323 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/68fb2641-d900-4570-9106-dfa68f2a21a2-sys\") pod \"node-exporter-kjj54\" (UID: \"68fb2641-d900-4570-9106-dfa68f2a21a2\") " pod="openshift-monitoring/node-exporter-kjj54" Apr 22 18:49:35.181390 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.181354 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/68fb2641-d900-4570-9106-dfa68f2a21a2-node-exporter-textfile\") pod \"node-exporter-kjj54\" (UID: \"68fb2641-d900-4570-9106-dfa68f2a21a2\") " pod="openshift-monitoring/node-exporter-kjj54" Apr 22 18:49:35.181390 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.181378 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/68fb2641-d900-4570-9106-dfa68f2a21a2-node-exporter-accelerators-collector-config\") pod \"node-exporter-kjj54\" (UID: \"68fb2641-d900-4570-9106-dfa68f2a21a2\") " pod="openshift-monitoring/node-exporter-kjj54" Apr 22 18:49:35.181768 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.181425 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/1ec1a089-f815-406b-a0ff-8ee4d88530dc-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-jqtr9\" (UID: 
\"1ec1a089-f815-406b-a0ff-8ee4d88530dc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqtr9" Apr 22 18:49:35.181768 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.181454 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/68fb2641-d900-4570-9106-dfa68f2a21a2-root\") pod \"node-exporter-kjj54\" (UID: \"68fb2641-d900-4570-9106-dfa68f2a21a2\") " pod="openshift-monitoring/node-exporter-kjj54" Apr 22 18:49:35.181768 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.181483 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ec1a089-f815-406b-a0ff-8ee4d88530dc-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-jqtr9\" (UID: \"1ec1a089-f815-406b-a0ff-8ee4d88530dc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqtr9" Apr 22 18:49:35.282473 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.282448 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/68fb2641-d900-4570-9106-dfa68f2a21a2-node-exporter-tls\") pod \"node-exporter-kjj54\" (UID: \"68fb2641-d900-4570-9106-dfa68f2a21a2\") " pod="openshift-monitoring/node-exporter-kjj54" Apr 22 18:49:35.282612 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.282479 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/68fb2641-d900-4570-9106-dfa68f2a21a2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-kjj54\" (UID: \"68fb2641-d900-4570-9106-dfa68f2a21a2\") " pod="openshift-monitoring/node-exporter-kjj54" Apr 22 18:49:35.282612 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.282500 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1ec1a089-f815-406b-a0ff-8ee4d88530dc-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-jqtr9\" (UID: \"1ec1a089-f815-406b-a0ff-8ee4d88530dc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqtr9" Apr 22 18:49:35.282612 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.282517 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/68fb2641-d900-4570-9106-dfa68f2a21a2-sys\") pod \"node-exporter-kjj54\" (UID: \"68fb2641-d900-4570-9106-dfa68f2a21a2\") " pod="openshift-monitoring/node-exporter-kjj54" Apr 22 18:49:35.282612 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.282541 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/68fb2641-d900-4570-9106-dfa68f2a21a2-node-exporter-textfile\") pod \"node-exporter-kjj54\" (UID: \"68fb2641-d900-4570-9106-dfa68f2a21a2\") " pod="openshift-monitoring/node-exporter-kjj54" Apr 22 18:49:35.282612 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.282589 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/68fb2641-d900-4570-9106-dfa68f2a21a2-node-exporter-accelerators-collector-config\") pod \"node-exporter-kjj54\" (UID: \"68fb2641-d900-4570-9106-dfa68f2a21a2\") " pod="openshift-monitoring/node-exporter-kjj54" Apr 22 18:49:35.282612 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:49:35.282606 2573 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 18:49:35.282986 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.282616 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/68fb2641-d900-4570-9106-dfa68f2a21a2-sys\") pod \"node-exporter-kjj54\" (UID: \"68fb2641-d900-4570-9106-dfa68f2a21a2\") " pod="openshift-monitoring/node-exporter-kjj54" Apr 22 18:49:35.282986 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.282635 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/1ec1a089-f815-406b-a0ff-8ee4d88530dc-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-jqtr9\" (UID: \"1ec1a089-f815-406b-a0ff-8ee4d88530dc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqtr9" Apr 22 18:49:35.282986 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.282665 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/68fb2641-d900-4570-9106-dfa68f2a21a2-root\") pod \"node-exporter-kjj54\" (UID: \"68fb2641-d900-4570-9106-dfa68f2a21a2\") " pod="openshift-monitoring/node-exporter-kjj54" Apr 22 18:49:35.282986 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:49:35.282683 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68fb2641-d900-4570-9106-dfa68f2a21a2-node-exporter-tls podName:68fb2641-d900-4570-9106-dfa68f2a21a2 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:35.78266091 +0000 UTC m=+171.022776997 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/68fb2641-d900-4570-9106-dfa68f2a21a2-node-exporter-tls") pod "node-exporter-kjj54" (UID: "68fb2641-d900-4570-9106-dfa68f2a21a2") : secret "node-exporter-tls" not found Apr 22 18:49:35.282986 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.282705 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/68fb2641-d900-4570-9106-dfa68f2a21a2-root\") pod \"node-exporter-kjj54\" (UID: \"68fb2641-d900-4570-9106-dfa68f2a21a2\") " pod="openshift-monitoring/node-exporter-kjj54" Apr 22 18:49:35.282986 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.282721 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ec1a089-f815-406b-a0ff-8ee4d88530dc-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-jqtr9\" (UID: \"1ec1a089-f815-406b-a0ff-8ee4d88530dc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqtr9" Apr 22 18:49:35.282986 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.282751 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/68fb2641-d900-4570-9106-dfa68f2a21a2-metrics-client-ca\") pod \"node-exporter-kjj54\" (UID: \"68fb2641-d900-4570-9106-dfa68f2a21a2\") " pod="openshift-monitoring/node-exporter-kjj54" Apr 22 18:49:35.282986 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.282783 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/68fb2641-d900-4570-9106-dfa68f2a21a2-node-exporter-wtmp\") pod \"node-exporter-kjj54\" (UID: \"68fb2641-d900-4570-9106-dfa68f2a21a2\") " pod="openshift-monitoring/node-exporter-kjj54" Apr 22 18:49:35.282986 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.282830 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/1ec1a089-f815-406b-a0ff-8ee4d88530dc-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-jqtr9\" (UID: \"1ec1a089-f815-406b-a0ff-8ee4d88530dc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqtr9" Apr 22 18:49:35.282986 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.282851 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/68fb2641-d900-4570-9106-dfa68f2a21a2-node-exporter-textfile\") pod \"node-exporter-kjj54\" (UID: \"68fb2641-d900-4570-9106-dfa68f2a21a2\") " pod="openshift-monitoring/node-exporter-kjj54" Apr 22 18:49:35.282986 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.282865 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-npf4p\" (UniqueName: \"kubernetes.io/projected/1ec1a089-f815-406b-a0ff-8ee4d88530dc-kube-api-access-npf4p\") pod \"kube-state-metrics-69db897b98-jqtr9\" (UID: \"1ec1a089-f815-406b-a0ff-8ee4d88530dc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqtr9" Apr 22 18:49:35.282986 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.282912 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ec1a089-f815-406b-a0ff-8ee4d88530dc-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jqtr9\" (UID: \"1ec1a089-f815-406b-a0ff-8ee4d88530dc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqtr9" Apr 22 18:49:35.282986 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.282944 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9trv\" (UniqueName: \"kubernetes.io/projected/68fb2641-d900-4570-9106-dfa68f2a21a2-kube-api-access-s9trv\") pod \"node-exporter-kjj54\" (UID: 
\"68fb2641-d900-4570-9106-dfa68f2a21a2\") " pod="openshift-monitoring/node-exporter-kjj54" Apr 22 18:49:35.282986 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.282974 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/68fb2641-d900-4570-9106-dfa68f2a21a2-node-exporter-wtmp\") pod \"node-exporter-kjj54\" (UID: \"68fb2641-d900-4570-9106-dfa68f2a21a2\") " pod="openshift-monitoring/node-exporter-kjj54" Apr 22 18:49:35.283629 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:49:35.283050 2573 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 22 18:49:35.283629 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:49:35.283108 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ec1a089-f815-406b-a0ff-8ee4d88530dc-kube-state-metrics-tls podName:1ec1a089-f815-406b-a0ff-8ee4d88530dc nodeName:}" failed. No retries permitted until 2026-04-22 18:49:35.783092128 +0000 UTC m=+171.023208223 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/1ec1a089-f815-406b-a0ff-8ee4d88530dc-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-jqtr9" (UID: "1ec1a089-f815-406b-a0ff-8ee4d88530dc") : secret "kube-state-metrics-tls" not found Apr 22 18:49:35.283629 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.283230 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/68fb2641-d900-4570-9106-dfa68f2a21a2-node-exporter-accelerators-collector-config\") pod \"node-exporter-kjj54\" (UID: \"68fb2641-d900-4570-9106-dfa68f2a21a2\") " pod="openshift-monitoring/node-exporter-kjj54" Apr 22 18:49:35.283629 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.283369 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/1ec1a089-f815-406b-a0ff-8ee4d88530dc-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-jqtr9\" (UID: \"1ec1a089-f815-406b-a0ff-8ee4d88530dc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqtr9" Apr 22 18:49:35.283629 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.283385 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/68fb2641-d900-4570-9106-dfa68f2a21a2-metrics-client-ca\") pod \"node-exporter-kjj54\" (UID: \"68fb2641-d900-4570-9106-dfa68f2a21a2\") " pod="openshift-monitoring/node-exporter-kjj54" Apr 22 18:49:35.283629 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.283429 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/1ec1a089-f815-406b-a0ff-8ee4d88530dc-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-jqtr9\" (UID: 
\"1ec1a089-f815-406b-a0ff-8ee4d88530dc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqtr9" Apr 22 18:49:35.283845 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.283795 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ec1a089-f815-406b-a0ff-8ee4d88530dc-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-jqtr9\" (UID: \"1ec1a089-f815-406b-a0ff-8ee4d88530dc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqtr9" Apr 22 18:49:35.285236 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.285218 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1ec1a089-f815-406b-a0ff-8ee4d88530dc-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-jqtr9\" (UID: \"1ec1a089-f815-406b-a0ff-8ee4d88530dc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqtr9" Apr 22 18:49:35.285466 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.285447 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/68fb2641-d900-4570-9106-dfa68f2a21a2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-kjj54\" (UID: \"68fb2641-d900-4570-9106-dfa68f2a21a2\") " pod="openshift-monitoring/node-exporter-kjj54" Apr 22 18:49:35.292631 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.292610 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-npf4p\" (UniqueName: \"kubernetes.io/projected/1ec1a089-f815-406b-a0ff-8ee4d88530dc-kube-api-access-npf4p\") pod \"kube-state-metrics-69db897b98-jqtr9\" (UID: \"1ec1a089-f815-406b-a0ff-8ee4d88530dc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqtr9" Apr 22 18:49:35.292820 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.292802 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-s9trv\" (UniqueName: \"kubernetes.io/projected/68fb2641-d900-4570-9106-dfa68f2a21a2-kube-api-access-s9trv\") pod \"node-exporter-kjj54\" (UID: \"68fb2641-d900-4570-9106-dfa68f2a21a2\") " pod="openshift-monitoring/node-exporter-kjj54" Apr 22 18:49:35.787355 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.787309 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ec1a089-f815-406b-a0ff-8ee4d88530dc-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jqtr9\" (UID: \"1ec1a089-f815-406b-a0ff-8ee4d88530dc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqtr9" Apr 22 18:49:35.787355 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.787361 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/68fb2641-d900-4570-9106-dfa68f2a21a2-node-exporter-tls\") pod \"node-exporter-kjj54\" (UID: \"68fb2641-d900-4570-9106-dfa68f2a21a2\") " pod="openshift-monitoring/node-exporter-kjj54" Apr 22 18:49:35.789768 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.789745 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/68fb2641-d900-4570-9106-dfa68f2a21a2-node-exporter-tls\") pod \"node-exporter-kjj54\" (UID: \"68fb2641-d900-4570-9106-dfa68f2a21a2\") " pod="openshift-monitoring/node-exporter-kjj54" Apr 22 18:49:35.789900 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.789854 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ec1a089-f815-406b-a0ff-8ee4d88530dc-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jqtr9\" (UID: \"1ec1a089-f815-406b-a0ff-8ee4d88530dc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqtr9" Apr 22 18:49:35.968816 
ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.968778 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-kjj54" Apr 22 18:49:35.974463 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:35.974446 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-jqtr9" Apr 22 18:49:36.118111 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:36.118083 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-jqtr9"] Apr 22 18:49:36.120732 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:49:36.120705 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ec1a089_f815_406b_a0ff_8ee4d88530dc.slice/crio-022a1496fb7edc5c1867d678bdcf14ac1ab177cbe9d84bd1f427b4b17e3705fb WatchSource:0}: Error finding container 022a1496fb7edc5c1867d678bdcf14ac1ab177cbe9d84bd1f427b4b17e3705fb: Status 404 returned error can't find the container with id 022a1496fb7edc5c1867d678bdcf14ac1ab177cbe9d84bd1f427b4b17e3705fb Apr 22 18:49:36.888694 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:36.888653 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kjj54" event={"ID":"68fb2641-d900-4570-9106-dfa68f2a21a2","Type":"ContainerStarted","Data":"73f346bdcb960e34205acc49a16e7aaa4bc3a48c9e48257909cd32154eeb7ffd"} Apr 22 18:49:36.889840 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:36.889813 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jqtr9" event={"ID":"1ec1a089-f815-406b-a0ff-8ee4d88530dc","Type":"ContainerStarted","Data":"022a1496fb7edc5c1867d678bdcf14ac1ab177cbe9d84bd1f427b4b17e3705fb"} Apr 22 18:49:37.118844 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.118756 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/thanos-querier-8fbdfc678-rs8ng"] Apr 22 18:49:37.125016 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.124985 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-8fbdfc678-rs8ng" Apr 22 18:49:37.127490 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.127466 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 22 18:49:37.127623 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.127524 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 22 18:49:37.127623 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.127510 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 22 18:49:37.127951 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.127926 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-fojmarkm8pvk8\"" Apr 22 18:49:37.128055 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.127974 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 22 18:49:37.128055 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.128014 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 22 18:49:37.128727 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.128663 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-gvjq9\"" Apr 22 18:49:37.131044 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.131025 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/thanos-querier-8fbdfc678-rs8ng"] Apr 22 18:49:37.299495 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.299427 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/df0c630f-3b5c-4c80-a392-4f1cbddb040e-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-8fbdfc678-rs8ng\" (UID: \"df0c630f-3b5c-4c80-a392-4f1cbddb040e\") " pod="openshift-monitoring/thanos-querier-8fbdfc678-rs8ng" Apr 22 18:49:37.299495 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.299466 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/df0c630f-3b5c-4c80-a392-4f1cbddb040e-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-8fbdfc678-rs8ng\" (UID: \"df0c630f-3b5c-4c80-a392-4f1cbddb040e\") " pod="openshift-monitoring/thanos-querier-8fbdfc678-rs8ng" Apr 22 18:49:37.299632 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.299495 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/df0c630f-3b5c-4c80-a392-4f1cbddb040e-secret-thanos-querier-tls\") pod \"thanos-querier-8fbdfc678-rs8ng\" (UID: \"df0c630f-3b5c-4c80-a392-4f1cbddb040e\") " pod="openshift-monitoring/thanos-querier-8fbdfc678-rs8ng" Apr 22 18:49:37.299632 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.299582 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/df0c630f-3b5c-4c80-a392-4f1cbddb040e-secret-grpc-tls\") pod \"thanos-querier-8fbdfc678-rs8ng\" (UID: \"df0c630f-3b5c-4c80-a392-4f1cbddb040e\") " pod="openshift-monitoring/thanos-querier-8fbdfc678-rs8ng" Apr 22 18:49:37.299700 ip-10-0-133-84 kubenswrapper[2573]: 
I0422 18:49:37.299631 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/df0c630f-3b5c-4c80-a392-4f1cbddb040e-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-8fbdfc678-rs8ng\" (UID: \"df0c630f-3b5c-4c80-a392-4f1cbddb040e\") " pod="openshift-monitoring/thanos-querier-8fbdfc678-rs8ng" Apr 22 18:49:37.299700 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.299651 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvs2x\" (UniqueName: \"kubernetes.io/projected/df0c630f-3b5c-4c80-a392-4f1cbddb040e-kube-api-access-jvs2x\") pod \"thanos-querier-8fbdfc678-rs8ng\" (UID: \"df0c630f-3b5c-4c80-a392-4f1cbddb040e\") " pod="openshift-monitoring/thanos-querier-8fbdfc678-rs8ng" Apr 22 18:49:37.299700 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.299670 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/df0c630f-3b5c-4c80-a392-4f1cbddb040e-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-8fbdfc678-rs8ng\" (UID: \"df0c630f-3b5c-4c80-a392-4f1cbddb040e\") " pod="openshift-monitoring/thanos-querier-8fbdfc678-rs8ng" Apr 22 18:49:37.299812 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.299752 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/df0c630f-3b5c-4c80-a392-4f1cbddb040e-metrics-client-ca\") pod \"thanos-querier-8fbdfc678-rs8ng\" (UID: \"df0c630f-3b5c-4c80-a392-4f1cbddb040e\") " pod="openshift-monitoring/thanos-querier-8fbdfc678-rs8ng" Apr 22 18:49:37.360098 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.359945 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjbsn" Apr 22 18:49:37.400600 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.400570 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/df0c630f-3b5c-4c80-a392-4f1cbddb040e-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-8fbdfc678-rs8ng\" (UID: \"df0c630f-3b5c-4c80-a392-4f1cbddb040e\") " pod="openshift-monitoring/thanos-querier-8fbdfc678-rs8ng" Apr 22 18:49:37.400718 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.400620 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/df0c630f-3b5c-4c80-a392-4f1cbddb040e-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-8fbdfc678-rs8ng\" (UID: \"df0c630f-3b5c-4c80-a392-4f1cbddb040e\") " pod="openshift-monitoring/thanos-querier-8fbdfc678-rs8ng" Apr 22 18:49:37.400718 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.400651 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/df0c630f-3b5c-4c80-a392-4f1cbddb040e-secret-thanos-querier-tls\") pod \"thanos-querier-8fbdfc678-rs8ng\" (UID: \"df0c630f-3b5c-4c80-a392-4f1cbddb040e\") " pod="openshift-monitoring/thanos-querier-8fbdfc678-rs8ng" Apr 22 18:49:37.400882 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.400862 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/df0c630f-3b5c-4c80-a392-4f1cbddb040e-secret-grpc-tls\") pod \"thanos-querier-8fbdfc678-rs8ng\" (UID: \"df0c630f-3b5c-4c80-a392-4f1cbddb040e\") " pod="openshift-monitoring/thanos-querier-8fbdfc678-rs8ng" Apr 22 18:49:37.400972 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.400936 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/df0c630f-3b5c-4c80-a392-4f1cbddb040e-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-8fbdfc678-rs8ng\" (UID: \"df0c630f-3b5c-4c80-a392-4f1cbddb040e\") " pod="openshift-monitoring/thanos-querier-8fbdfc678-rs8ng" Apr 22 18:49:37.401026 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.400971 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jvs2x\" (UniqueName: \"kubernetes.io/projected/df0c630f-3b5c-4c80-a392-4f1cbddb040e-kube-api-access-jvs2x\") pod \"thanos-querier-8fbdfc678-rs8ng\" (UID: \"df0c630f-3b5c-4c80-a392-4f1cbddb040e\") " pod="openshift-monitoring/thanos-querier-8fbdfc678-rs8ng" Apr 22 18:49:37.401026 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.401004 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/df0c630f-3b5c-4c80-a392-4f1cbddb040e-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-8fbdfc678-rs8ng\" (UID: \"df0c630f-3b5c-4c80-a392-4f1cbddb040e\") " pod="openshift-monitoring/thanos-querier-8fbdfc678-rs8ng" Apr 22 18:49:37.401121 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.401061 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/df0c630f-3b5c-4c80-a392-4f1cbddb040e-metrics-client-ca\") pod \"thanos-querier-8fbdfc678-rs8ng\" (UID: \"df0c630f-3b5c-4c80-a392-4f1cbddb040e\") " pod="openshift-monitoring/thanos-querier-8fbdfc678-rs8ng" Apr 22 18:49:37.401900 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.401849 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/df0c630f-3b5c-4c80-a392-4f1cbddb040e-metrics-client-ca\") pod 
\"thanos-querier-8fbdfc678-rs8ng\" (UID: \"df0c630f-3b5c-4c80-a392-4f1cbddb040e\") " pod="openshift-monitoring/thanos-querier-8fbdfc678-rs8ng" Apr 22 18:49:37.403901 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.403856 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/df0c630f-3b5c-4c80-a392-4f1cbddb040e-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-8fbdfc678-rs8ng\" (UID: \"df0c630f-3b5c-4c80-a392-4f1cbddb040e\") " pod="openshift-monitoring/thanos-querier-8fbdfc678-rs8ng" Apr 22 18:49:37.404912 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.404523 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/df0c630f-3b5c-4c80-a392-4f1cbddb040e-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-8fbdfc678-rs8ng\" (UID: \"df0c630f-3b5c-4c80-a392-4f1cbddb040e\") " pod="openshift-monitoring/thanos-querier-8fbdfc678-rs8ng" Apr 22 18:49:37.404912 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.404764 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/df0c630f-3b5c-4c80-a392-4f1cbddb040e-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-8fbdfc678-rs8ng\" (UID: \"df0c630f-3b5c-4c80-a392-4f1cbddb040e\") " pod="openshift-monitoring/thanos-querier-8fbdfc678-rs8ng" Apr 22 18:49:37.404912 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.404834 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/df0c630f-3b5c-4c80-a392-4f1cbddb040e-secret-thanos-querier-tls\") pod \"thanos-querier-8fbdfc678-rs8ng\" (UID: \"df0c630f-3b5c-4c80-a392-4f1cbddb040e\") " pod="openshift-monitoring/thanos-querier-8fbdfc678-rs8ng" Apr 22 18:49:37.405341 
ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.405304 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/df0c630f-3b5c-4c80-a392-4f1cbddb040e-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-8fbdfc678-rs8ng\" (UID: \"df0c630f-3b5c-4c80-a392-4f1cbddb040e\") " pod="openshift-monitoring/thanos-querier-8fbdfc678-rs8ng" Apr 22 18:49:37.406707 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.406678 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/df0c630f-3b5c-4c80-a392-4f1cbddb040e-secret-grpc-tls\") pod \"thanos-querier-8fbdfc678-rs8ng\" (UID: \"df0c630f-3b5c-4c80-a392-4f1cbddb040e\") " pod="openshift-monitoring/thanos-querier-8fbdfc678-rs8ng" Apr 22 18:49:37.412081 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.412039 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvs2x\" (UniqueName: \"kubernetes.io/projected/df0c630f-3b5c-4c80-a392-4f1cbddb040e-kube-api-access-jvs2x\") pod \"thanos-querier-8fbdfc678-rs8ng\" (UID: \"df0c630f-3b5c-4c80-a392-4f1cbddb040e\") " pod="openshift-monitoring/thanos-querier-8fbdfc678-rs8ng" Apr 22 18:49:37.437329 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.437300 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-8fbdfc678-rs8ng" Apr 22 18:49:37.565307 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.565260 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-8fbdfc678-rs8ng"] Apr 22 18:49:37.569720 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:49:37.569689 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf0c630f_3b5c_4c80_a392_4f1cbddb040e.slice/crio-d434b5d594cb52abba97655487213824240acb424b5bf2b3f8ff9ab38e5f93bb WatchSource:0}: Error finding container d434b5d594cb52abba97655487213824240acb424b5bf2b3f8ff9ab38e5f93bb: Status 404 returned error can't find the container with id d434b5d594cb52abba97655487213824240acb424b5bf2b3f8ff9ab38e5f93bb Apr 22 18:49:37.867720 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.867643 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-q62qx" Apr 22 18:49:37.893441 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.893405 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8fbdfc678-rs8ng" event={"ID":"df0c630f-3b5c-4c80-a392-4f1cbddb040e","Type":"ContainerStarted","Data":"d434b5d594cb52abba97655487213824240acb424b5bf2b3f8ff9ab38e5f93bb"} Apr 22 18:49:37.895722 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.895690 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jqtr9" event={"ID":"1ec1a089-f815-406b-a0ff-8ee4d88530dc","Type":"ContainerStarted","Data":"bfd54b7d3cd8da3af97d51f958e6e526fa5a19eefefc13d96f87759fc6439f10"} Apr 22 18:49:37.895848 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.895729 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jqtr9" 
event={"ID":"1ec1a089-f815-406b-a0ff-8ee4d88530dc","Type":"ContainerStarted","Data":"1a4fb6be07bda8780052b3e13bc69fd2da922bd6be766ebdcc10226915117bba"} Apr 22 18:49:37.895848 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.895743 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jqtr9" event={"ID":"1ec1a089-f815-406b-a0ff-8ee4d88530dc","Type":"ContainerStarted","Data":"3a1640367d586cd3b04a52b576f4732aeb037d09d6103c954004704b1e4b6c1f"} Apr 22 18:49:37.897315 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.897286 2573 generic.go:358] "Generic (PLEG): container finished" podID="68fb2641-d900-4570-9106-dfa68f2a21a2" containerID="82f1f67adcd419e8ab3765be715a6e86ce3785a3313078d972186a5fbb61d136" exitCode=0 Apr 22 18:49:37.897417 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.897337 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kjj54" event={"ID":"68fb2641-d900-4570-9106-dfa68f2a21a2","Type":"ContainerDied","Data":"82f1f67adcd419e8ab3765be715a6e86ce3785a3313078d972186a5fbb61d136"} Apr 22 18:49:37.916027 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:37.915971 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-jqtr9" podStartSLOduration=1.830588171 podStartE2EDuration="2.915953538s" podCreationTimestamp="2026-04-22 18:49:35 +0000 UTC" firstStartedPulling="2026-04-22 18:49:36.122668572 +0000 UTC m=+171.362784652" lastFinishedPulling="2026-04-22 18:49:37.20803392 +0000 UTC m=+172.448150019" observedRunningTime="2026-04-22 18:49:37.915294602 +0000 UTC m=+173.155410740" watchObservedRunningTime="2026-04-22 18:49:37.915953538 +0000 UTC m=+173.156069641" Apr 22 18:49:38.902515 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:38.902471 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kjj54" 
event={"ID":"68fb2641-d900-4570-9106-dfa68f2a21a2","Type":"ContainerStarted","Data":"87ec0433a11ab94cae67877267d8fbae702db8874f4ce7db3406068101585920"} Apr 22 18:49:38.902515 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:38.902522 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kjj54" event={"ID":"68fb2641-d900-4570-9106-dfa68f2a21a2","Type":"ContainerStarted","Data":"d1fc94b12a82a7a058b9f698f1949e7bca74c28842f820a5e98d61906a7d52f5"} Apr 22 18:49:38.920339 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:38.920287 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-kjj54" podStartSLOduration=3.071147435 podStartE2EDuration="3.920272663s" podCreationTimestamp="2026-04-22 18:49:35 +0000 UTC" firstStartedPulling="2026-04-22 18:49:35.980702954 +0000 UTC m=+171.220819034" lastFinishedPulling="2026-04-22 18:49:36.829828183 +0000 UTC m=+172.069944262" observedRunningTime="2026-04-22 18:49:38.920015632 +0000 UTC m=+174.160131755" watchObservedRunningTime="2026-04-22 18:49:38.920272663 +0000 UTC m=+174.160388764" Apr 22 18:49:39.908353 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:39.908316 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8fbdfc678-rs8ng" event={"ID":"df0c630f-3b5c-4c80-a392-4f1cbddb040e","Type":"ContainerStarted","Data":"3de532f3185ca9a794146fe982b5f13c289521df6487a5032567e8e719e48925"} Apr 22 18:49:39.908353 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:39.908361 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8fbdfc678-rs8ng" event={"ID":"df0c630f-3b5c-4c80-a392-4f1cbddb040e","Type":"ContainerStarted","Data":"06b42c50b1f3339d48025fc25d0f95f8420558f8902d3f19168d8c6090fe75be"} Apr 22 18:49:39.908753 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:39.908373 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/thanos-querier-8fbdfc678-rs8ng" event={"ID":"df0c630f-3b5c-4c80-a392-4f1cbddb040e","Type":"ContainerStarted","Data":"09553e53198e225e6f9cef8091e1046c975a98f6301aa31ca67a29e3622b628c"} Apr 22 18:49:40.913266 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:40.913228 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8fbdfc678-rs8ng" event={"ID":"df0c630f-3b5c-4c80-a392-4f1cbddb040e","Type":"ContainerStarted","Data":"551f522061c18199495c5b90010f42714dd079383d8f329816c295ebcef33435"} Apr 22 18:49:40.913266 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:40.913267 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8fbdfc678-rs8ng" event={"ID":"df0c630f-3b5c-4c80-a392-4f1cbddb040e","Type":"ContainerStarted","Data":"e7cc0fe9982f7105832ce450f5e5894380b3bd2fb8add88fdc796f3463f61d2e"} Apr 22 18:49:40.913688 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:40.913280 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8fbdfc678-rs8ng" event={"ID":"df0c630f-3b5c-4c80-a392-4f1cbddb040e","Type":"ContainerStarted","Data":"a8efdfce1d282e351fcdf67e225dea9ba6f52e4b1b69a2a7f3f760032497092a"} Apr 22 18:49:40.913688 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:40.913495 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-8fbdfc678-rs8ng" Apr 22 18:49:40.936797 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:40.936754 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-8fbdfc678-rs8ng" podStartSLOduration=1.257777788 podStartE2EDuration="3.936741269s" podCreationTimestamp="2026-04-22 18:49:37 +0000 UTC" firstStartedPulling="2026-04-22 18:49:37.571662713 +0000 UTC m=+172.811778799" lastFinishedPulling="2026-04-22 18:49:40.250626192 +0000 UTC m=+175.490742280" observedRunningTime="2026-04-22 
18:49:40.934556044 +0000 UTC m=+176.174672146" watchObservedRunningTime="2026-04-22 18:49:40.936741269 +0000 UTC m=+176.176857370" Apr 22 18:49:44.099107 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:44.099069 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858d898d6f-fcp6p" podUID="a6943a10-15d5-4283-9ddd-54b7ee9bf3e4" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 18:49:44.099641 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:44.099134 2573 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858d898d6f-fcp6p" Apr 22 18:49:44.099641 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:44.099605 2573 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"a530c1761287102a3e7401705b1ce267194dace01c8576ce6027544e98b121e9"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858d898d6f-fcp6p" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 22 18:49:44.099742 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:44.099663 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858d898d6f-fcp6p" podUID="a6943a10-15d5-4283-9ddd-54b7ee9bf3e4" containerName="service-proxy" containerID="cri-o://a530c1761287102a3e7401705b1ce267194dace01c8576ce6027544e98b121e9" gracePeriod=30 Apr 22 18:49:44.925955 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:44.925921 2573 generic.go:358] "Generic (PLEG): container finished" podID="a6943a10-15d5-4283-9ddd-54b7ee9bf3e4" containerID="a530c1761287102a3e7401705b1ce267194dace01c8576ce6027544e98b121e9" exitCode=2 Apr 22 18:49:44.926119 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:44.925986 2573 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858d898d6f-fcp6p" event={"ID":"a6943a10-15d5-4283-9ddd-54b7ee9bf3e4","Type":"ContainerDied","Data":"a530c1761287102a3e7401705b1ce267194dace01c8576ce6027544e98b121e9"} Apr 22 18:49:44.926119 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:44.926025 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858d898d6f-fcp6p" event={"ID":"a6943a10-15d5-4283-9ddd-54b7ee9bf3e4","Type":"ContainerStarted","Data":"be9d24fa15c56d72afdc7490d134e33b1a8eba3f421d85793e235735e380e68b"} Apr 22 18:49:46.921410 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:49:46.921382 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-8fbdfc678-rs8ng" Apr 22 18:50:56.287157 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:50:56.287115 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c87517bd-8a13-4cb2-bf88-0b3d8c58b67c-metrics-certs\") pod \"network-metrics-daemon-mjbsn\" (UID: \"c87517bd-8a13-4cb2-bf88-0b3d8c58b67c\") " pod="openshift-multus/network-metrics-daemon-mjbsn" Apr 22 18:50:56.289595 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:50:56.289576 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c87517bd-8a13-4cb2-bf88-0b3d8c58b67c-metrics-certs\") pod \"network-metrics-daemon-mjbsn\" (UID: \"c87517bd-8a13-4cb2-bf88-0b3d8c58b67c\") " pod="openshift-multus/network-metrics-daemon-mjbsn" Apr 22 18:50:56.563901 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:50:56.563820 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-m94d4\"" Apr 22 18:50:56.571831 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:50:56.571806 2573 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjbsn" Apr 22 18:50:56.684545 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:50:56.684519 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mjbsn"] Apr 22 18:50:56.688337 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:50:56.688305 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc87517bd_8a13_4cb2_bf88_0b3d8c58b67c.slice/crio-116f745864a61e635c596bcd5ecdd0c2bc56443c7b31ed063fdb8e748e07c331 WatchSource:0}: Error finding container 116f745864a61e635c596bcd5ecdd0c2bc56443c7b31ed063fdb8e748e07c331: Status 404 returned error can't find the container with id 116f745864a61e635c596bcd5ecdd0c2bc56443c7b31ed063fdb8e748e07c331 Apr 22 18:50:57.113658 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:50:57.113611 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mjbsn" event={"ID":"c87517bd-8a13-4cb2-bf88-0b3d8c58b67c","Type":"ContainerStarted","Data":"116f745864a61e635c596bcd5ecdd0c2bc56443c7b31ed063fdb8e748e07c331"} Apr 22 18:50:58.118357 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:50:58.118319 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mjbsn" event={"ID":"c87517bd-8a13-4cb2-bf88-0b3d8c58b67c","Type":"ContainerStarted","Data":"f8cafb2d60353a5a5c35a990e0f5b0d5b39376a5820fc6d8a2a739f3588e2a9e"} Apr 22 18:50:58.118357 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:50:58.118355 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mjbsn" event={"ID":"c87517bd-8a13-4cb2-bf88-0b3d8c58b67c","Type":"ContainerStarted","Data":"d0229320e65d9047c58ff8c767b7fa06029e1e8b73ec5cd2186881cb956aab27"} Apr 22 18:50:58.133775 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:50:58.133722 2573 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-mjbsn" podStartSLOduration=252.296083118 podStartE2EDuration="4m13.13370888s" podCreationTimestamp="2026-04-22 18:46:45 +0000 UTC" firstStartedPulling="2026-04-22 18:50:56.690155535 +0000 UTC m=+251.930271616" lastFinishedPulling="2026-04-22 18:50:57.527781298 +0000 UTC m=+252.767897378" observedRunningTime="2026-04-22 18:50:58.132421553 +0000 UTC m=+253.372537656" watchObservedRunningTime="2026-04-22 18:50:58.13370888 +0000 UTC m=+253.373824982" Apr 22 18:51:45.282112 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:51:45.282081 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cr6wp_bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278/ovn-acl-logging/0.log" Apr 22 18:51:45.282879 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:51:45.282865 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cr6wp_bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278/ovn-acl-logging/0.log" Apr 22 18:51:45.285945 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:51:45.285925 2573 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 18:52:11.434863 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:52:11.434830 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-4h4sg"] Apr 22 18:52:11.436652 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:52:11.436637 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4h4sg" Apr 22 18:52:11.439037 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:52:11.439016 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 22 18:52:11.439237 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:52:11.439221 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 22 18:52:11.440225 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:52:11.440202 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 22 18:52:11.440313 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:52:11.440226 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-rb9zh\"" Apr 22 18:52:11.440313 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:52:11.440203 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 22 18:52:11.440405 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:52:11.440203 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 22 18:52:11.444736 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:52:11.444714 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-4h4sg"] Apr 22 18:52:11.512348 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:52:11.512323 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x585\" (UniqueName: \"kubernetes.io/projected/7a35b553-685d-4b73-9801-c178c9491499-kube-api-access-6x585\") pod \"keda-metrics-apiserver-7c9f485588-4h4sg\" (UID: \"7a35b553-685d-4b73-9801-c178c9491499\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4h4sg" Apr 22 
18:52:11.512495 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:52:11.512376 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/7a35b553-685d-4b73-9801-c178c9491499-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-4h4sg\" (UID: \"7a35b553-685d-4b73-9801-c178c9491499\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4h4sg" Apr 22 18:52:11.512495 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:52:11.512395 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7a35b553-685d-4b73-9801-c178c9491499-certificates\") pod \"keda-metrics-apiserver-7c9f485588-4h4sg\" (UID: \"7a35b553-685d-4b73-9801-c178c9491499\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4h4sg" Apr 22 18:52:11.613722 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:52:11.613695 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/7a35b553-685d-4b73-9801-c178c9491499-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-4h4sg\" (UID: \"7a35b553-685d-4b73-9801-c178c9491499\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4h4sg" Apr 22 18:52:11.613722 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:52:11.613724 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7a35b553-685d-4b73-9801-c178c9491499-certificates\") pod \"keda-metrics-apiserver-7c9f485588-4h4sg\" (UID: \"7a35b553-685d-4b73-9801-c178c9491499\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4h4sg" Apr 22 18:52:11.613909 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:52:11.613759 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6x585\" (UniqueName: 
\"kubernetes.io/projected/7a35b553-685d-4b73-9801-c178c9491499-kube-api-access-6x585\") pod \"keda-metrics-apiserver-7c9f485588-4h4sg\" (UID: \"7a35b553-685d-4b73-9801-c178c9491499\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4h4sg" Apr 22 18:52:11.613909 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:52:11.613846 2573 secret.go:281] references non-existent secret key: tls.crt Apr 22 18:52:11.613909 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:52:11.613863 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 18:52:11.613909 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:52:11.613881 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-4h4sg: references non-existent secret key: tls.crt Apr 22 18:52:11.614055 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:52:11.613942 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7a35b553-685d-4b73-9801-c178c9491499-certificates podName:7a35b553-685d-4b73-9801-c178c9491499 nodeName:}" failed. No retries permitted until 2026-04-22 18:52:12.113924974 +0000 UTC m=+327.354041054 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/7a35b553-685d-4b73-9801-c178c9491499-certificates") pod "keda-metrics-apiserver-7c9f485588-4h4sg" (UID: "7a35b553-685d-4b73-9801-c178c9491499") : references non-existent secret key: tls.crt Apr 22 18:52:11.614055 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:52:11.614012 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/7a35b553-685d-4b73-9801-c178c9491499-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-4h4sg\" (UID: \"7a35b553-685d-4b73-9801-c178c9491499\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4h4sg" Apr 22 18:52:11.624587 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:52:11.624558 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x585\" (UniqueName: \"kubernetes.io/projected/7a35b553-685d-4b73-9801-c178c9491499-kube-api-access-6x585\") pod \"keda-metrics-apiserver-7c9f485588-4h4sg\" (UID: \"7a35b553-685d-4b73-9801-c178c9491499\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4h4sg" Apr 22 18:52:12.116716 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:52:12.116686 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7a35b553-685d-4b73-9801-c178c9491499-certificates\") pod \"keda-metrics-apiserver-7c9f485588-4h4sg\" (UID: \"7a35b553-685d-4b73-9801-c178c9491499\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4h4sg" Apr 22 18:52:12.116893 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:52:12.116796 2573 secret.go:281] references non-existent secret key: tls.crt Apr 22 18:52:12.116893 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:52:12.116807 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 18:52:12.116893 ip-10-0-133-84 kubenswrapper[2573]: E0422 
18:52:12.116824 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-4h4sg: references non-existent secret key: tls.crt
Apr 22 18:52:12.116893 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:52:12.116879 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7a35b553-685d-4b73-9801-c178c9491499-certificates podName:7a35b553-685d-4b73-9801-c178c9491499 nodeName:}" failed. No retries permitted until 2026-04-22 18:52:13.116864834 +0000 UTC m=+328.356980914 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/7a35b553-685d-4b73-9801-c178c9491499-certificates") pod "keda-metrics-apiserver-7c9f485588-4h4sg" (UID: "7a35b553-685d-4b73-9801-c178c9491499") : references non-existent secret key: tls.crt
Apr 22 18:52:13.125608 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:52:13.125570 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7a35b553-685d-4b73-9801-c178c9491499-certificates\") pod \"keda-metrics-apiserver-7c9f485588-4h4sg\" (UID: \"7a35b553-685d-4b73-9801-c178c9491499\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4h4sg"
Apr 22 18:52:13.126050 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:52:13.125736 2573 secret.go:281] references non-existent secret key: tls.crt
Apr 22 18:52:13.126050 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:52:13.125756 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 22 18:52:13.126050 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:52:13.125776 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-4h4sg: references non-existent secret key: tls.crt
Apr 22 18:52:13.126050 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:52:13.125841 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7a35b553-685d-4b73-9801-c178c9491499-certificates podName:7a35b553-685d-4b73-9801-c178c9491499 nodeName:}" failed. No retries permitted until 2026-04-22 18:52:15.1258175 +0000 UTC m=+330.365933580 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/7a35b553-685d-4b73-9801-c178c9491499-certificates") pod "keda-metrics-apiserver-7c9f485588-4h4sg" (UID: "7a35b553-685d-4b73-9801-c178c9491499") : references non-existent secret key: tls.crt
Apr 22 18:52:15.139091 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:52:15.139048 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7a35b553-685d-4b73-9801-c178c9491499-certificates\") pod \"keda-metrics-apiserver-7c9f485588-4h4sg\" (UID: \"7a35b553-685d-4b73-9801-c178c9491499\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4h4sg"
Apr 22 18:52:15.139506 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:52:15.139206 2573 secret.go:281] references non-existent secret key: tls.crt
Apr 22 18:52:15.139506 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:52:15.139218 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 22 18:52:15.139506 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:52:15.139235 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-4h4sg: references non-existent secret key: tls.crt
Apr 22 18:52:15.139506 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:52:15.139288 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7a35b553-685d-4b73-9801-c178c9491499-certificates podName:7a35b553-685d-4b73-9801-c178c9491499 nodeName:}" failed. No retries permitted until 2026-04-22 18:52:19.139273033 +0000 UTC m=+334.379389113 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/7a35b553-685d-4b73-9801-c178c9491499-certificates") pod "keda-metrics-apiserver-7c9f485588-4h4sg" (UID: "7a35b553-685d-4b73-9801-c178c9491499") : references non-existent secret key: tls.crt
Apr 22 18:52:19.169507 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:52:19.169466 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7a35b553-685d-4b73-9801-c178c9491499-certificates\") pod \"keda-metrics-apiserver-7c9f485588-4h4sg\" (UID: \"7a35b553-685d-4b73-9801-c178c9491499\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4h4sg"
Apr 22 18:52:19.172120 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:52:19.172100 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7a35b553-685d-4b73-9801-c178c9491499-certificates\") pod \"keda-metrics-apiserver-7c9f485588-4h4sg\" (UID: \"7a35b553-685d-4b73-9801-c178c9491499\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4h4sg"
Apr 22 18:52:19.246874 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:52:19.246850 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4h4sg"
Apr 22 18:52:19.363600 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:52:19.363575 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-4h4sg"]
Apr 22 18:52:19.363943 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:52:19.363903 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a35b553_685d_4b73_9801_c178c9491499.slice/crio-c6a69bab71a695a7275b97acdfae23e492853779a4d6b8281098332a6f5d777f WatchSource:0}: Error finding container c6a69bab71a695a7275b97acdfae23e492853779a4d6b8281098332a6f5d777f: Status 404 returned error can't find the container with id c6a69bab71a695a7275b97acdfae23e492853779a4d6b8281098332a6f5d777f
Apr 22 18:52:19.365120 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:52:19.365104 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 18:52:20.337721 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:52:20.337679 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4h4sg" event={"ID":"7a35b553-685d-4b73-9801-c178c9491499","Type":"ContainerStarted","Data":"c6a69bab71a695a7275b97acdfae23e492853779a4d6b8281098332a6f5d777f"}
Apr 22 18:52:23.348122 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:52:23.348083 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4h4sg" event={"ID":"7a35b553-685d-4b73-9801-c178c9491499","Type":"ContainerStarted","Data":"16dd5a9ff961e0ffd1a4ddee5e4fff2624acc5de52a6a93192926dc8fc5471d2"}
Apr 22 18:52:23.348536 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:52:23.348208 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4h4sg"
Apr 22 18:52:23.364240 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:52:23.364150 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4h4sg" podStartSLOduration=9.334933921 podStartE2EDuration="12.364135948s" podCreationTimestamp="2026-04-22 18:52:11 +0000 UTC" firstStartedPulling="2026-04-22 18:52:19.365248364 +0000 UTC m=+334.605364444" lastFinishedPulling="2026-04-22 18:52:22.394450389 +0000 UTC m=+337.634566471" observedRunningTime="2026-04-22 18:52:23.362923608 +0000 UTC m=+338.603039710" watchObservedRunningTime="2026-04-22 18:52:23.364135948 +0000 UTC m=+338.604252051"
Apr 22 18:52:34.356055 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:52:34.355976 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4h4sg"
Apr 22 18:53:20.645003 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:20.644969 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-2nnwh"]
Apr 22 18:53:20.647133 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:20.647116 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-2nnwh"
Apr 22 18:53:20.649681 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:20.649657 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 22 18:53:20.649681 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:20.649672 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:53:20.649819 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:20.649684 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-gcr28\""
Apr 22 18:53:20.657910 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:20.657889 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-2nnwh"]
Apr 22 18:53:20.707291 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:20.707264 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dst6\" (UniqueName: \"kubernetes.io/projected/f6ac7109-1ee7-401a-9a91-09d5ab42ed5f-kube-api-access-9dst6\") pod \"cert-manager-operator-controller-manager-54b9655956-2nnwh\" (UID: \"f6ac7109-1ee7-401a-9a91-09d5ab42ed5f\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-2nnwh"
Apr 22 18:53:20.707415 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:20.707306 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f6ac7109-1ee7-401a-9a91-09d5ab42ed5f-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-2nnwh\" (UID: \"f6ac7109-1ee7-401a-9a91-09d5ab42ed5f\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-2nnwh"
Apr 22 18:53:20.807681 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:20.807641 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f6ac7109-1ee7-401a-9a91-09d5ab42ed5f-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-2nnwh\" (UID: \"f6ac7109-1ee7-401a-9a91-09d5ab42ed5f\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-2nnwh"
Apr 22 18:53:20.807834 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:20.807716 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9dst6\" (UniqueName: \"kubernetes.io/projected/f6ac7109-1ee7-401a-9a91-09d5ab42ed5f-kube-api-access-9dst6\") pod \"cert-manager-operator-controller-manager-54b9655956-2nnwh\" (UID: \"f6ac7109-1ee7-401a-9a91-09d5ab42ed5f\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-2nnwh"
Apr 22 18:53:20.808064 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:20.808041 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f6ac7109-1ee7-401a-9a91-09d5ab42ed5f-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-2nnwh\" (UID: \"f6ac7109-1ee7-401a-9a91-09d5ab42ed5f\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-2nnwh"
Apr 22 18:53:20.815330 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:20.815305 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dst6\" (UniqueName: \"kubernetes.io/projected/f6ac7109-1ee7-401a-9a91-09d5ab42ed5f-kube-api-access-9dst6\") pod \"cert-manager-operator-controller-manager-54b9655956-2nnwh\" (UID: \"f6ac7109-1ee7-401a-9a91-09d5ab42ed5f\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-2nnwh"
Apr 22 18:53:20.956561 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:20.956479 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-2nnwh"
Apr 22 18:53:21.079925 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:21.079889 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-2nnwh"]
Apr 22 18:53:21.086412 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:53:21.086374 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6ac7109_1ee7_401a_9a91_09d5ab42ed5f.slice/crio-8435a9db05c7b48121c413ca8917e83ae408cc723ed03aec228d64e0b6fecd1f WatchSource:0}: Error finding container 8435a9db05c7b48121c413ca8917e83ae408cc723ed03aec228d64e0b6fecd1f: Status 404 returned error can't find the container with id 8435a9db05c7b48121c413ca8917e83ae408cc723ed03aec228d64e0b6fecd1f
Apr 22 18:53:21.501209 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:21.501154 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-2nnwh" event={"ID":"f6ac7109-1ee7-401a-9a91-09d5ab42ed5f","Type":"ContainerStarted","Data":"8435a9db05c7b48121c413ca8917e83ae408cc723ed03aec228d64e0b6fecd1f"}
Apr 22 18:53:24.512813 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:24.512770 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-2nnwh" event={"ID":"f6ac7109-1ee7-401a-9a91-09d5ab42ed5f","Type":"ContainerStarted","Data":"05db2ef918f709ba854858f4d7474abd37c6203b36722b4cf1c842ec60e14fb1"}
Apr 22 18:53:24.534650 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:24.534605 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-2nnwh" podStartSLOduration=1.953132209 podStartE2EDuration="4.534591328s" podCreationTimestamp="2026-04-22 18:53:20 +0000 UTC" firstStartedPulling="2026-04-22 18:53:21.08930314 +0000 UTC m=+396.329419221" lastFinishedPulling="2026-04-22 18:53:23.670762245 +0000 UTC m=+398.910878340" observedRunningTime="2026-04-22 18:53:24.532263852 +0000 UTC m=+399.772379954" watchObservedRunningTime="2026-04-22 18:53:24.534591328 +0000 UTC m=+399.774707430"
Apr 22 18:53:32.489653 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:32.489615 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-kpqbn"]
Apr 22 18:53:32.492006 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:32.491987 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-kpqbn"
Apr 22 18:53:32.494477 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:32.494453 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 22 18:53:32.495522 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:32.495503 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-r7nmn\""
Apr 22 18:53:32.495622 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:32.495507 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 22 18:53:32.501423 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:32.501390 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-kpqbn"]
Apr 22 18:53:32.594861 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:32.594826 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gsz2\" (UniqueName: \"kubernetes.io/projected/d6144cf7-7154-430a-8787-d53f3a59bd8d-kube-api-access-4gsz2\") pod \"cert-manager-cainjector-68b757865b-kpqbn\" (UID: \"d6144cf7-7154-430a-8787-d53f3a59bd8d\") " pod="cert-manager/cert-manager-cainjector-68b757865b-kpqbn"
Apr 22 18:53:32.595016 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:32.594866 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d6144cf7-7154-430a-8787-d53f3a59bd8d-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-kpqbn\" (UID: \"d6144cf7-7154-430a-8787-d53f3a59bd8d\") " pod="cert-manager/cert-manager-cainjector-68b757865b-kpqbn"
Apr 22 18:53:32.696055 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:32.696016 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gsz2\" (UniqueName: \"kubernetes.io/projected/d6144cf7-7154-430a-8787-d53f3a59bd8d-kube-api-access-4gsz2\") pod \"cert-manager-cainjector-68b757865b-kpqbn\" (UID: \"d6144cf7-7154-430a-8787-d53f3a59bd8d\") " pod="cert-manager/cert-manager-cainjector-68b757865b-kpqbn"
Apr 22 18:53:32.696225 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:32.696063 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d6144cf7-7154-430a-8787-d53f3a59bd8d-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-kpqbn\" (UID: \"d6144cf7-7154-430a-8787-d53f3a59bd8d\") " pod="cert-manager/cert-manager-cainjector-68b757865b-kpqbn"
Apr 22 18:53:32.704070 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:32.704045 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d6144cf7-7154-430a-8787-d53f3a59bd8d-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-kpqbn\" (UID: \"d6144cf7-7154-430a-8787-d53f3a59bd8d\") " pod="cert-manager/cert-manager-cainjector-68b757865b-kpqbn"
Apr 22 18:53:32.704303 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:32.704285 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gsz2\" (UniqueName: \"kubernetes.io/projected/d6144cf7-7154-430a-8787-d53f3a59bd8d-kube-api-access-4gsz2\") pod \"cert-manager-cainjector-68b757865b-kpqbn\" (UID: \"d6144cf7-7154-430a-8787-d53f3a59bd8d\") " pod="cert-manager/cert-manager-cainjector-68b757865b-kpqbn"
Apr 22 18:53:32.800406 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:32.800335 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-kpqbn"
Apr 22 18:53:32.920625 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:32.920586 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-kpqbn"]
Apr 22 18:53:32.923604 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:53:32.923575 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6144cf7_7154_430a_8787_d53f3a59bd8d.slice/crio-87a72411ddf6b5ee2884cfdea90cc95929ab1d632c765c95296b2d9c4104f08f WatchSource:0}: Error finding container 87a72411ddf6b5ee2884cfdea90cc95929ab1d632c765c95296b2d9c4104f08f: Status 404 returned error can't find the container with id 87a72411ddf6b5ee2884cfdea90cc95929ab1d632c765c95296b2d9c4104f08f
Apr 22 18:53:33.540234 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:33.540200 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-kpqbn" event={"ID":"d6144cf7-7154-430a-8787-d53f3a59bd8d","Type":"ContainerStarted","Data":"87a72411ddf6b5ee2884cfdea90cc95929ab1d632c765c95296b2d9c4104f08f"}
Apr 22 18:53:36.550760 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:36.550727 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-kpqbn" event={"ID":"d6144cf7-7154-430a-8787-d53f3a59bd8d","Type":"ContainerStarted","Data":"b7b1882ff50b634dafafdb83d32da074c5d327c8f3b95d610125fb972abafd75"}
Apr 22 18:53:36.569489 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:36.565712 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-kpqbn" podStartSLOduration=1.529347526 podStartE2EDuration="4.565694036s" podCreationTimestamp="2026-04-22 18:53:32 +0000 UTC" firstStartedPulling="2026-04-22 18:53:32.925798568 +0000 UTC m=+408.165914648" lastFinishedPulling="2026-04-22 18:53:35.962145065 +0000 UTC m=+411.202261158" observedRunningTime="2026-04-22 18:53:36.563514442 +0000 UTC m=+411.803630544" watchObservedRunningTime="2026-04-22 18:53:36.565694036 +0000 UTC m=+411.805810141"
Apr 22 18:53:44.052916 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:44.052879 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-bhbch"]
Apr 22 18:53:44.055363 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:44.055343 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bhbch"
Apr 22 18:53:44.057713 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:44.057688 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-rd2v5\""
Apr 22 18:53:44.057802 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:44.057730 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:53:44.058806 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:44.058785 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 22 18:53:44.063810 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:44.063779 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-bhbch"]
Apr 22 18:53:44.186395 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:44.186365 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/43ebccbc-000c-4ebe-97b0-7c1fc69bebd5-tmp\") pod \"openshift-lws-operator-bfc7f696d-bhbch\" (UID: \"43ebccbc-000c-4ebe-97b0-7c1fc69bebd5\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bhbch"
Apr 22 18:53:44.186560 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:44.186420 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfdkt\" (UniqueName: \"kubernetes.io/projected/43ebccbc-000c-4ebe-97b0-7c1fc69bebd5-kube-api-access-bfdkt\") pod \"openshift-lws-operator-bfc7f696d-bhbch\" (UID: \"43ebccbc-000c-4ebe-97b0-7c1fc69bebd5\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bhbch"
Apr 22 18:53:44.287050 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:44.287021 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/43ebccbc-000c-4ebe-97b0-7c1fc69bebd5-tmp\") pod \"openshift-lws-operator-bfc7f696d-bhbch\" (UID: \"43ebccbc-000c-4ebe-97b0-7c1fc69bebd5\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bhbch"
Apr 22 18:53:44.287225 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:44.287077 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bfdkt\" (UniqueName: \"kubernetes.io/projected/43ebccbc-000c-4ebe-97b0-7c1fc69bebd5-kube-api-access-bfdkt\") pod \"openshift-lws-operator-bfc7f696d-bhbch\" (UID: \"43ebccbc-000c-4ebe-97b0-7c1fc69bebd5\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bhbch"
Apr 22 18:53:44.287417 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:44.287399 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/43ebccbc-000c-4ebe-97b0-7c1fc69bebd5-tmp\") pod \"openshift-lws-operator-bfc7f696d-bhbch\" (UID: \"43ebccbc-000c-4ebe-97b0-7c1fc69bebd5\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bhbch"
Apr 22 18:53:44.295416 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:44.295397 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfdkt\" (UniqueName: \"kubernetes.io/projected/43ebccbc-000c-4ebe-97b0-7c1fc69bebd5-kube-api-access-bfdkt\") pod \"openshift-lws-operator-bfc7f696d-bhbch\" (UID: \"43ebccbc-000c-4ebe-97b0-7c1fc69bebd5\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bhbch"
Apr 22 18:53:44.365612 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:44.365556 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bhbch"
Apr 22 18:53:44.483771 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:44.483740 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-bhbch"]
Apr 22 18:53:44.487276 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:53:44.487249 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43ebccbc_000c_4ebe_97b0_7c1fc69bebd5.slice/crio-1caf2ff54892844f7526cd8ecfdb00443dc7df661a1f4246a2c7f902966abdde WatchSource:0}: Error finding container 1caf2ff54892844f7526cd8ecfdb00443dc7df661a1f4246a2c7f902966abdde: Status 404 returned error can't find the container with id 1caf2ff54892844f7526cd8ecfdb00443dc7df661a1f4246a2c7f902966abdde
Apr 22 18:53:44.576225 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:44.576190 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bhbch" event={"ID":"43ebccbc-000c-4ebe-97b0-7c1fc69bebd5","Type":"ContainerStarted","Data":"1caf2ff54892844f7526cd8ecfdb00443dc7df661a1f4246a2c7f902966abdde"}
Apr 22 18:53:47.587694 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:47.587603 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bhbch" event={"ID":"43ebccbc-000c-4ebe-97b0-7c1fc69bebd5","Type":"ContainerStarted","Data":"e0ee9f95d03a7d7bfa4eda86267cc4ebf32b3eaa4e52494c5241d5f8f8205c0d"}
Apr 22 18:53:47.602567 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:53:47.602524 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bhbch" podStartSLOduration=0.879349451 podStartE2EDuration="3.602510445s" podCreationTimestamp="2026-04-22 18:53:44 +0000 UTC" firstStartedPulling="2026-04-22 18:53:44.489211725 +0000 UTC m=+419.729327804" lastFinishedPulling="2026-04-22 18:53:47.212372701 +0000 UTC m=+422.452488798" observedRunningTime="2026-04-22 18:53:47.600757701 +0000 UTC m=+422.840873804" watchObservedRunningTime="2026-04-22 18:53:47.602510445 +0000 UTC m=+422.842626547"
Apr 22 18:54:21.430477 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:21.430446 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h"]
Apr 22 18:54:21.434859 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:21.434838 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h"
Apr 22 18:54:21.437374 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:21.437352 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 22 18:54:21.437505 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:21.437474 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 22 18:54:21.437571 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:21.437501 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\""
Apr 22 18:54:21.437571 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:21.437481 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-l2zpt\""
Apr 22 18:54:21.437671 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:21.437604 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\""
Apr 22 18:54:21.437754 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:21.437740 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\""
Apr 22 18:54:21.437802 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:21.437781 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 22 18:54:21.443349 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:21.443327 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h"]
Apr 22 18:54:21.573492 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:21.573463 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-gwz9h\" (UID: \"333a7495-eca5-4a6b-ad2d-1248cceeb2f1\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h"
Apr 22 18:54:21.573660 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:21.573499 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhbn2\" (UniqueName: \"kubernetes.io/projected/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-kube-api-access-bhbn2\") pod \"istiod-openshift-gateway-7cd77c7ffd-gwz9h\" (UID: \"333a7495-eca5-4a6b-ad2d-1248cceeb2f1\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h"
Apr 22 18:54:21.573660 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:21.573517 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-gwz9h\" (UID: \"333a7495-eca5-4a6b-ad2d-1248cceeb2f1\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h"
Apr 22 18:54:21.573660 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:21.573618 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-gwz9h\" (UID: \"333a7495-eca5-4a6b-ad2d-1248cceeb2f1\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h"
Apr 22 18:54:21.573847 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:21.573662 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-gwz9h\" (UID: \"333a7495-eca5-4a6b-ad2d-1248cceeb2f1\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h"
Apr 22 18:54:21.573847 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:21.573709 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-gwz9h\" (UID: \"333a7495-eca5-4a6b-ad2d-1248cceeb2f1\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h"
Apr 22 18:54:21.573847 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:21.573747 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-gwz9h\" (UID: \"333a7495-eca5-4a6b-ad2d-1248cceeb2f1\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h"
Apr 22 18:54:21.674926 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:21.674899 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-gwz9h\" (UID: \"333a7495-eca5-4a6b-ad2d-1248cceeb2f1\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h"
Apr 22 18:54:21.675090 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:21.674935 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-gwz9h\" (UID: \"333a7495-eca5-4a6b-ad2d-1248cceeb2f1\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h"
Apr 22 18:54:21.675090 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:21.674955 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-gwz9h\" (UID: \"333a7495-eca5-4a6b-ad2d-1248cceeb2f1\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h"
Apr 22 18:54:21.675090 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:21.674972 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-gwz9h\" (UID: \"333a7495-eca5-4a6b-ad2d-1248cceeb2f1\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h"
Apr 22 18:54:21.675090 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:21.675032 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-gwz9h\" (UID: \"333a7495-eca5-4a6b-ad2d-1248cceeb2f1\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h"
Apr 22 18:54:21.675090 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:21.675061 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bhbn2\" (UniqueName: \"kubernetes.io/projected/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-kube-api-access-bhbn2\") pod \"istiod-openshift-gateway-7cd77c7ffd-gwz9h\" (UID: \"333a7495-eca5-4a6b-ad2d-1248cceeb2f1\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h"
Apr 22 18:54:21.675090 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:21.675085 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-gwz9h\" (UID: \"333a7495-eca5-4a6b-ad2d-1248cceeb2f1\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h"
Apr 22 18:54:21.675689 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:21.675659 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-gwz9h\" (UID: \"333a7495-eca5-4a6b-ad2d-1248cceeb2f1\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h"
Apr 22 18:54:21.677519 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:21.677492 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-gwz9h\" (UID: \"333a7495-eca5-4a6b-ad2d-1248cceeb2f1\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h"
Apr 22 18:54:21.677702 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:21.677678 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-gwz9h\" (UID: \"333a7495-eca5-4a6b-ad2d-1248cceeb2f1\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h"
Apr 22 18:54:21.677774 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:21.677718 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-gwz9h\" (UID: \"333a7495-eca5-4a6b-ad2d-1248cceeb2f1\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h"
Apr 22 18:54:21.677774 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:21.677759 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-gwz9h\" (UID: \"333a7495-eca5-4a6b-ad2d-1248cceeb2f1\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h"
Apr 22 18:54:21.682932 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:21.682877 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-gwz9h\" (UID: \"333a7495-eca5-4a6b-ad2d-1248cceeb2f1\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h"
Apr 22 18:54:21.683152 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:21.683132 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhbn2\" (UniqueName: \"kubernetes.io/projected/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-kube-api-access-bhbn2\") pod \"istiod-openshift-gateway-7cd77c7ffd-gwz9h\" (UID: \"333a7495-eca5-4a6b-ad2d-1248cceeb2f1\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h"
Apr 22 18:54:21.744692 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:21.744673 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h"
Apr 22 18:54:21.869160 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:21.869132 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h"]
Apr 22 18:54:21.871582 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:54:21.871555 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod333a7495_eca5_4a6b_ad2d_1248cceeb2f1.slice/crio-dd00455a2052814fb3bba612037088e31a0b4f71057f5db334920a23e285d7e1 WatchSource:0}: Error finding container dd00455a2052814fb3bba612037088e31a0b4f71057f5db334920a23e285d7e1: Status 404 returned error can't find the container with id dd00455a2052814fb3bba612037088e31a0b4f71057f5db334920a23e285d7e1
Apr 22 18:54:22.703669 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:22.703631 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h" event={"ID":"333a7495-eca5-4a6b-ad2d-1248cceeb2f1","Type":"ContainerStarted","Data":"dd00455a2052814fb3bba612037088e31a0b4f71057f5db334920a23e285d7e1"}
Apr 22 18:54:25.144240 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:25.144200 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892152Ki","pods":"250"}
Apr 22 18:54:25.144484 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:25.144268 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892152Ki","pods":"250"}
Apr 22 18:54:25.716638 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:25.716525 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h"
event={"ID":"333a7495-eca5-4a6b-ad2d-1248cceeb2f1","Type":"ContainerStarted","Data":"4b3406a87714764258a52d298412d3240248d74eda5a0d74ff318ccf3184a567"} Apr 22 18:54:25.716819 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:25.716750 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h" Apr 22 18:54:25.741268 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:25.741211 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h" podStartSLOduration=1.4707461849999999 podStartE2EDuration="4.741192923s" podCreationTimestamp="2026-04-22 18:54:21 +0000 UTC" firstStartedPulling="2026-04-22 18:54:21.873533093 +0000 UTC m=+457.113649173" lastFinishedPulling="2026-04-22 18:54:25.143979826 +0000 UTC m=+460.384095911" observedRunningTime="2026-04-22 18:54:25.739238701 +0000 UTC m=+460.979354795" watchObservedRunningTime="2026-04-22 18:54:25.741192923 +0000 UTC m=+460.981309021" Apr 22 18:54:26.722987 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:26.722962 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h" Apr 22 18:54:50.210539 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:50.210501 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-xchc5"] Apr 22 18:54:50.213720 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:50.213700 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-xchc5" Apr 22 18:54:50.216668 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:50.216648 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 22 18:54:50.216763 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:50.216648 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 22 18:54:50.217859 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:50.217830 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-4p9dm\"" Apr 22 18:54:50.229713 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:50.229690 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-xchc5"] Apr 22 18:54:50.275365 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:50.275328 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw69b\" (UniqueName: \"kubernetes.io/projected/c58e17ad-8347-4144-bc44-fdfebd91e686-kube-api-access-mw69b\") pod \"limitador-operator-controller-manager-c7fb4c8d5-xchc5\" (UID: \"c58e17ad-8347-4144-bc44-fdfebd91e686\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-xchc5" Apr 22 18:54:50.376093 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:50.376056 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mw69b\" (UniqueName: \"kubernetes.io/projected/c58e17ad-8347-4144-bc44-fdfebd91e686-kube-api-access-mw69b\") pod \"limitador-operator-controller-manager-c7fb4c8d5-xchc5\" (UID: \"c58e17ad-8347-4144-bc44-fdfebd91e686\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-xchc5" Apr 22 18:54:50.386714 ip-10-0-133-84 kubenswrapper[2573]: 
I0422 18:54:50.386681 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw69b\" (UniqueName: \"kubernetes.io/projected/c58e17ad-8347-4144-bc44-fdfebd91e686-kube-api-access-mw69b\") pod \"limitador-operator-controller-manager-c7fb4c8d5-xchc5\" (UID: \"c58e17ad-8347-4144-bc44-fdfebd91e686\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-xchc5" Apr 22 18:54:50.523686 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:50.523596 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-xchc5" Apr 22 18:54:50.651514 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:50.651456 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-xchc5"] Apr 22 18:54:50.654448 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:54:50.654423 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc58e17ad_8347_4144_bc44_fdfebd91e686.slice/crio-a27060d514fe4a3cbe58901fd6183913f98c56993a1760244b80b348403e7618 WatchSource:0}: Error finding container a27060d514fe4a3cbe58901fd6183913f98c56993a1760244b80b348403e7618: Status 404 returned error can't find the container with id a27060d514fe4a3cbe58901fd6183913f98c56993a1760244b80b348403e7618 Apr 22 18:54:50.796775 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:50.796675 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-xchc5" event={"ID":"c58e17ad-8347-4144-bc44-fdfebd91e686","Type":"ContainerStarted","Data":"a27060d514fe4a3cbe58901fd6183913f98c56993a1760244b80b348403e7618"} Apr 22 18:54:54.811916 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:54.811883 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-xchc5" 
event={"ID":"c58e17ad-8347-4144-bc44-fdfebd91e686","Type":"ContainerStarted","Data":"81bed89d0e873ecc9059e1c1997c8b342bc797c19eb56a3fc3c21bd84cc7be93"} Apr 22 18:54:54.812285 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:54.812018 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-xchc5" Apr 22 18:54:54.831618 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:54.831579 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-xchc5" podStartSLOduration=1.717736628 podStartE2EDuration="4.831566265s" podCreationTimestamp="2026-04-22 18:54:50 +0000 UTC" firstStartedPulling="2026-04-22 18:54:50.656865534 +0000 UTC m=+485.896981614" lastFinishedPulling="2026-04-22 18:54:53.770695171 +0000 UTC m=+489.010811251" observedRunningTime="2026-04-22 18:54:54.830442386 +0000 UTC m=+490.070558482" watchObservedRunningTime="2026-04-22 18:54:54.831566265 +0000 UTC m=+490.071682366" Apr 22 18:54:57.118597 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:57.118566 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-hz6x8"] Apr 22 18:54:57.121889 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:57.121868 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-hz6x8" Apr 22 18:54:57.124465 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:57.124448 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-zgpd7\"" Apr 22 18:54:57.133651 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:57.133600 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-hz6x8"] Apr 22 18:54:57.231860 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:57.231823 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvtbx\" (UniqueName: \"kubernetes.io/projected/18c72311-fe1f-4302-81e4-3b4f20fc0097-kube-api-access-zvtbx\") pod \"authorino-operator-7587b89b76-hz6x8\" (UID: \"18c72311-fe1f-4302-81e4-3b4f20fc0097\") " pod="kuadrant-system/authorino-operator-7587b89b76-hz6x8" Apr 22 18:54:57.333054 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:57.333016 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvtbx\" (UniqueName: \"kubernetes.io/projected/18c72311-fe1f-4302-81e4-3b4f20fc0097-kube-api-access-zvtbx\") pod \"authorino-operator-7587b89b76-hz6x8\" (UID: \"18c72311-fe1f-4302-81e4-3b4f20fc0097\") " pod="kuadrant-system/authorino-operator-7587b89b76-hz6x8" Apr 22 18:54:57.344131 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:57.344099 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvtbx\" (UniqueName: \"kubernetes.io/projected/18c72311-fe1f-4302-81e4-3b4f20fc0097-kube-api-access-zvtbx\") pod \"authorino-operator-7587b89b76-hz6x8\" (UID: \"18c72311-fe1f-4302-81e4-3b4f20fc0097\") " pod="kuadrant-system/authorino-operator-7587b89b76-hz6x8" Apr 22 18:54:57.432922 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:57.432882 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-hz6x8" Apr 22 18:54:57.563527 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:57.563493 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-hz6x8"] Apr 22 18:54:57.567685 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:54:57.567650 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18c72311_fe1f_4302_81e4_3b4f20fc0097.slice/crio-6f5644af1d62616b408091448f1b249dddb9d0f7cf3848826e823f88b6baac92 WatchSource:0}: Error finding container 6f5644af1d62616b408091448f1b249dddb9d0f7cf3848826e823f88b6baac92: Status 404 returned error can't find the container with id 6f5644af1d62616b408091448f1b249dddb9d0f7cf3848826e823f88b6baac92 Apr 22 18:54:57.822682 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:57.822598 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-hz6x8" event={"ID":"18c72311-fe1f-4302-81e4-3b4f20fc0097","Type":"ContainerStarted","Data":"6f5644af1d62616b408091448f1b249dddb9d0f7cf3848826e823f88b6baac92"} Apr 22 18:54:59.831648 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:59.831611 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-hz6x8" event={"ID":"18c72311-fe1f-4302-81e4-3b4f20fc0097","Type":"ContainerStarted","Data":"925ee41229f35fb90a79d8dbb37eb3fc83ebd470879c0e3b92d43593399807ee"} Apr 22 18:54:59.832138 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:59.831857 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-7587b89b76-hz6x8" Apr 22 18:54:59.852666 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:54:59.852612 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-7587b89b76-hz6x8" podStartSLOduration=1.364168525 
podStartE2EDuration="2.852595501s" podCreationTimestamp="2026-04-22 18:54:57 +0000 UTC" firstStartedPulling="2026-04-22 18:54:57.57028983 +0000 UTC m=+492.810405912" lastFinishedPulling="2026-04-22 18:54:59.058716808 +0000 UTC m=+494.298832888" observedRunningTime="2026-04-22 18:54:59.851749291 +0000 UTC m=+495.091865393" watchObservedRunningTime="2026-04-22 18:54:59.852595501 +0000 UTC m=+495.092711583" Apr 22 18:55:01.282396 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:55:01.282360 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-xxvgm"] Apr 22 18:55:01.285594 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:55:01.285577 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-xxvgm" Apr 22 18:55:01.289325 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:55:01.289303 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-465px\"" Apr 22 18:55:01.299592 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:55:01.299572 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-xxvgm"] Apr 22 18:55:01.369149 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:55:01.369120 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdkrx\" (UniqueName: \"kubernetes.io/projected/9b1066c7-5168-4d2c-80f3-4994f76a6be0-kube-api-access-mdkrx\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-xxvgm\" (UID: \"9b1066c7-5168-4d2c-80f3-4994f76a6be0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-xxvgm" Apr 22 18:55:01.369297 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:55:01.369184 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9b1066c7-5168-4d2c-80f3-4994f76a6be0-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-xxvgm\" (UID: \"9b1066c7-5168-4d2c-80f3-4994f76a6be0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-xxvgm" Apr 22 18:55:01.470438 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:55:01.470409 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdkrx\" (UniqueName: \"kubernetes.io/projected/9b1066c7-5168-4d2c-80f3-4994f76a6be0-kube-api-access-mdkrx\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-xxvgm\" (UID: \"9b1066c7-5168-4d2c-80f3-4994f76a6be0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-xxvgm" Apr 22 18:55:01.470571 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:55:01.470467 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9b1066c7-5168-4d2c-80f3-4994f76a6be0-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-xxvgm\" (UID: \"9b1066c7-5168-4d2c-80f3-4994f76a6be0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-xxvgm" Apr 22 18:55:01.470891 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:55:01.470868 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9b1066c7-5168-4d2c-80f3-4994f76a6be0-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-xxvgm\" (UID: \"9b1066c7-5168-4d2c-80f3-4994f76a6be0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-xxvgm" Apr 22 18:55:01.478610 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:55:01.478589 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdkrx\" (UniqueName: 
\"kubernetes.io/projected/9b1066c7-5168-4d2c-80f3-4994f76a6be0-kube-api-access-mdkrx\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-xxvgm\" (UID: \"9b1066c7-5168-4d2c-80f3-4994f76a6be0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-xxvgm" Apr 22 18:55:01.596367 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:55:01.596286 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-xxvgm" Apr 22 18:55:01.718084 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:55:01.718053 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-xxvgm"] Apr 22 18:55:01.721257 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:55:01.721229 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b1066c7_5168_4d2c_80f3_4994f76a6be0.slice/crio-9a7a06c34c541bd5ccc5bfe2240c5e38ba9d1c8ab02768bae07dea8efbbb1e7b WatchSource:0}: Error finding container 9a7a06c34c541bd5ccc5bfe2240c5e38ba9d1c8ab02768bae07dea8efbbb1e7b: Status 404 returned error can't find the container with id 9a7a06c34c541bd5ccc5bfe2240c5e38ba9d1c8ab02768bae07dea8efbbb1e7b Apr 22 18:55:01.844358 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:55:01.844324 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-xxvgm" event={"ID":"9b1066c7-5168-4d2c-80f3-4994f76a6be0","Type":"ContainerStarted","Data":"9a7a06c34c541bd5ccc5bfe2240c5e38ba9d1c8ab02768bae07dea8efbbb1e7b"} Apr 22 18:55:05.818035 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:55:05.818003 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-xchc5" Apr 22 18:55:06.862606 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:55:06.862512 2573 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-xxvgm" event={"ID":"9b1066c7-5168-4d2c-80f3-4994f76a6be0","Type":"ContainerStarted","Data":"3c1e04f3706c22083b5f06b27ad5227bb03ee52bcb67198ca2228326822db61d"} Apr 22 18:55:06.862998 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:55:06.862607 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-xxvgm" Apr 22 18:55:06.885737 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:55:06.885685 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-xxvgm" podStartSLOduration=1.074222541 podStartE2EDuration="5.885671029s" podCreationTimestamp="2026-04-22 18:55:01 +0000 UTC" firstStartedPulling="2026-04-22 18:55:01.723612038 +0000 UTC m=+496.963728118" lastFinishedPulling="2026-04-22 18:55:06.535060509 +0000 UTC m=+501.775176606" observedRunningTime="2026-04-22 18:55:06.883454173 +0000 UTC m=+502.123570291" watchObservedRunningTime="2026-04-22 18:55:06.885671029 +0000 UTC m=+502.125787130" Apr 22 18:55:10.836464 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:55:10.836436 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-7587b89b76-hz6x8" Apr 22 18:55:17.868141 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:55:17.868107 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-xxvgm" Apr 22 18:56:24.389208 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:24.389157 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-m6qfm"] Apr 22 18:56:24.392213 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:24.392197 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m6qfm" Apr 22 18:56:24.404356 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:24.404332 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-m6qfm"] Apr 22 18:56:24.532540 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:24.532506 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdmzh\" (UniqueName: \"kubernetes.io/projected/a4eb3066-1ffd-4e80-8482-976812d9d008-kube-api-access-rdmzh\") pod \"istiod-openshift-gateway-55ff986f96-m6qfm\" (UID: \"a4eb3066-1ffd-4e80-8482-976812d9d008\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m6qfm" Apr 22 18:56:24.532540 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:24.532546 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/a4eb3066-1ffd-4e80-8482-976812d9d008-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-m6qfm\" (UID: \"a4eb3066-1ffd-4e80-8482-976812d9d008\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m6qfm" Apr 22 18:56:24.532748 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:24.532565 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a4eb3066-1ffd-4e80-8482-976812d9d008-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-m6qfm\" (UID: \"a4eb3066-1ffd-4e80-8482-976812d9d008\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m6qfm" Apr 22 18:56:24.532748 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:24.532680 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/a4eb3066-1ffd-4e80-8482-976812d9d008-cacerts\") pod 
\"istiod-openshift-gateway-55ff986f96-m6qfm\" (UID: \"a4eb3066-1ffd-4e80-8482-976812d9d008\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m6qfm" Apr 22 18:56:24.532748 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:24.532713 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/a4eb3066-1ffd-4e80-8482-976812d9d008-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-m6qfm\" (UID: \"a4eb3066-1ffd-4e80-8482-976812d9d008\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m6qfm" Apr 22 18:56:24.532748 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:24.532734 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/a4eb3066-1ffd-4e80-8482-976812d9d008-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-m6qfm\" (UID: \"a4eb3066-1ffd-4e80-8482-976812d9d008\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m6qfm" Apr 22 18:56:24.532874 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:24.532755 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a4eb3066-1ffd-4e80-8482-976812d9d008-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-m6qfm\" (UID: \"a4eb3066-1ffd-4e80-8482-976812d9d008\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m6qfm" Apr 22 18:56:24.633187 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:24.633144 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/a4eb3066-1ffd-4e80-8482-976812d9d008-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-m6qfm\" (UID: \"a4eb3066-1ffd-4e80-8482-976812d9d008\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m6qfm" Apr 22 18:56:24.633345 
ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:24.633204 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/a4eb3066-1ffd-4e80-8482-976812d9d008-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-m6qfm\" (UID: \"a4eb3066-1ffd-4e80-8482-976812d9d008\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m6qfm" Apr 22 18:56:24.633345 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:24.633234 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/a4eb3066-1ffd-4e80-8482-976812d9d008-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-m6qfm\" (UID: \"a4eb3066-1ffd-4e80-8482-976812d9d008\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m6qfm" Apr 22 18:56:24.633345 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:24.633261 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a4eb3066-1ffd-4e80-8482-976812d9d008-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-m6qfm\" (UID: \"a4eb3066-1ffd-4e80-8482-976812d9d008\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m6qfm" Apr 22 18:56:24.633345 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:24.633304 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rdmzh\" (UniqueName: \"kubernetes.io/projected/a4eb3066-1ffd-4e80-8482-976812d9d008-kube-api-access-rdmzh\") pod \"istiod-openshift-gateway-55ff986f96-m6qfm\" (UID: \"a4eb3066-1ffd-4e80-8482-976812d9d008\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m6qfm" Apr 22 18:56:24.633345 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:24.633339 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: 
\"kubernetes.io/secret/a4eb3066-1ffd-4e80-8482-976812d9d008-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-m6qfm\" (UID: \"a4eb3066-1ffd-4e80-8482-976812d9d008\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m6qfm" Apr 22 18:56:24.633676 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:24.633551 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a4eb3066-1ffd-4e80-8482-976812d9d008-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-m6qfm\" (UID: \"a4eb3066-1ffd-4e80-8482-976812d9d008\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m6qfm" Apr 22 18:56:24.634052 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:24.634012 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/a4eb3066-1ffd-4e80-8482-976812d9d008-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-m6qfm\" (UID: \"a4eb3066-1ffd-4e80-8482-976812d9d008\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m6qfm" Apr 22 18:56:24.635798 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:24.635770 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/a4eb3066-1ffd-4e80-8482-976812d9d008-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-m6qfm\" (UID: \"a4eb3066-1ffd-4e80-8482-976812d9d008\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m6qfm" Apr 22 18:56:24.635926 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:24.635910 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/a4eb3066-1ffd-4e80-8482-976812d9d008-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-m6qfm\" (UID: \"a4eb3066-1ffd-4e80-8482-976812d9d008\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m6qfm" Apr 22 18:56:24.636215 
ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:24.636196 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a4eb3066-1ffd-4e80-8482-976812d9d008-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-m6qfm\" (UID: \"a4eb3066-1ffd-4e80-8482-976812d9d008\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m6qfm" Apr 22 18:56:24.636314 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:24.636283 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/a4eb3066-1ffd-4e80-8482-976812d9d008-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-m6qfm\" (UID: \"a4eb3066-1ffd-4e80-8482-976812d9d008\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m6qfm" Apr 22 18:56:24.641501 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:24.641459 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a4eb3066-1ffd-4e80-8482-976812d9d008-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-m6qfm\" (UID: \"a4eb3066-1ffd-4e80-8482-976812d9d008\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m6qfm" Apr 22 18:56:24.641999 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:24.641977 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdmzh\" (UniqueName: \"kubernetes.io/projected/a4eb3066-1ffd-4e80-8482-976812d9d008-kube-api-access-rdmzh\") pod \"istiod-openshift-gateway-55ff986f96-m6qfm\" (UID: \"a4eb3066-1ffd-4e80-8482-976812d9d008\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m6qfm" Apr 22 18:56:24.701955 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:24.701932 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m6qfm" Apr 22 18:56:24.828529 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:24.825932 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-m6qfm"] Apr 22 18:56:24.830572 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:24.830535 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892152Ki","pods":"250"} Apr 22 18:56:24.830681 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:24.830601 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892152Ki","pods":"250"} Apr 22 18:56:25.111490 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:25.111395 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m6qfm" event={"ID":"a4eb3066-1ffd-4e80-8482-976812d9d008","Type":"ContainerStarted","Data":"c767b4f10144a5d6145d0c5ff32359169264b9f8778e246e31aa1fe4663eceec"} Apr 22 18:56:25.111490 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:25.111441 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m6qfm" event={"ID":"a4eb3066-1ffd-4e80-8482-976812d9d008","Type":"ContainerStarted","Data":"de7d07d07821aeebba797e60b4085d4ed45a9951f9d64bca4e3ffce479c0b0a2"} Apr 22 18:56:25.111709 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:25.111601 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m6qfm" Apr 22 18:56:25.113124 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:25.113098 2573 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-m6qfm container/discovery namespace/openshift-ingress: Readiness probe status=failure 
output="HTTP probe failed with statuscode: 503" start-of-body= Apr 22 18:56:25.113251 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:25.113145 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m6qfm" podUID="a4eb3066-1ffd-4e80-8482-976812d9d008" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:56:25.133896 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:25.133839 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m6qfm" podStartSLOduration=1.13382425 podStartE2EDuration="1.13382425s" podCreationTimestamp="2026-04-22 18:56:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:56:25.133452153 +0000 UTC m=+580.373568259" watchObservedRunningTime="2026-04-22 18:56:25.13382425 +0000 UTC m=+580.373940352" Apr 22 18:56:26.117336 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:26.117304 2573 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-m6qfm container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 22 18:56:26.117891 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:26.117858 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m6qfm" podUID="a4eb3066-1ffd-4e80-8482-976812d9d008" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:56:29.117993 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:29.117960 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m6qfm" Apr 22 18:56:29.192802 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:29.192767 2573 kubelet.go:2553] "SyncLoop 
DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h"] Apr 22 18:56:29.193007 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:29.192986 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h" podUID="333a7495-eca5-4a6b-ad2d-1248cceeb2f1" containerName="discovery" containerID="cri-o://4b3406a87714764258a52d298412d3240248d74eda5a0d74ff318ccf3184a567" gracePeriod=30 Apr 22 18:56:29.435385 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:29.435360 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h" Apr 22 18:56:29.577203 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:29.577148 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-istio-kubeconfig\") pod \"333a7495-eca5-4a6b-ad2d-1248cceeb2f1\" (UID: \"333a7495-eca5-4a6b-ad2d-1248cceeb2f1\") " Apr 22 18:56:29.577368 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:29.577226 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-istio-csr-dns-cert\") pod \"333a7495-eca5-4a6b-ad2d-1248cceeb2f1\" (UID: \"333a7495-eca5-4a6b-ad2d-1248cceeb2f1\") " Apr 22 18:56:29.577368 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:29.577293 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhbn2\" (UniqueName: \"kubernetes.io/projected/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-kube-api-access-bhbn2\") pod \"333a7495-eca5-4a6b-ad2d-1248cceeb2f1\" (UID: \"333a7495-eca5-4a6b-ad2d-1248cceeb2f1\") " Apr 22 18:56:29.577368 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:29.577334 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-cacerts\") pod \"333a7495-eca5-4a6b-ad2d-1248cceeb2f1\" (UID: \"333a7495-eca5-4a6b-ad2d-1248cceeb2f1\") " Apr 22 18:56:29.577368 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:29.577358 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-istio-csr-ca-configmap\") pod \"333a7495-eca5-4a6b-ad2d-1248cceeb2f1\" (UID: \"333a7495-eca5-4a6b-ad2d-1248cceeb2f1\") " Apr 22 18:56:29.577581 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:29.577405 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-local-certs\") pod \"333a7495-eca5-4a6b-ad2d-1248cceeb2f1\" (UID: \"333a7495-eca5-4a6b-ad2d-1248cceeb2f1\") " Apr 22 18:56:29.577581 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:29.577433 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-istio-token\") pod \"333a7495-eca5-4a6b-ad2d-1248cceeb2f1\" (UID: \"333a7495-eca5-4a6b-ad2d-1248cceeb2f1\") " Apr 22 18:56:29.577833 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:29.577799 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-istio-csr-ca-configmap" (OuterVolumeSpecName: "istio-csr-ca-configmap") pod "333a7495-eca5-4a6b-ad2d-1248cceeb2f1" (UID: "333a7495-eca5-4a6b-ad2d-1248cceeb2f1"). InnerVolumeSpecName "istio-csr-ca-configmap". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:56:29.579845 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:29.579819 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-istio-csr-dns-cert" (OuterVolumeSpecName: "istio-csr-dns-cert") pod "333a7495-eca5-4a6b-ad2d-1248cceeb2f1" (UID: "333a7495-eca5-4a6b-ad2d-1248cceeb2f1"). InnerVolumeSpecName "istio-csr-dns-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:56:29.579845 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:29.579835 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-cacerts" (OuterVolumeSpecName: "cacerts") pod "333a7495-eca5-4a6b-ad2d-1248cceeb2f1" (UID: "333a7495-eca5-4a6b-ad2d-1248cceeb2f1"). InnerVolumeSpecName "cacerts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:56:29.580228 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:29.580196 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-local-certs" (OuterVolumeSpecName: "local-certs") pod "333a7495-eca5-4a6b-ad2d-1248cceeb2f1" (UID: "333a7495-eca5-4a6b-ad2d-1248cceeb2f1"). InnerVolumeSpecName "local-certs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:56:29.580315 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:29.580238 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-istio-kubeconfig" (OuterVolumeSpecName: "istio-kubeconfig") pod "333a7495-eca5-4a6b-ad2d-1248cceeb2f1" (UID: "333a7495-eca5-4a6b-ad2d-1248cceeb2f1"). InnerVolumeSpecName "istio-kubeconfig". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:56:29.580315 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:29.580250 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-istio-token" (OuterVolumeSpecName: "istio-token") pod "333a7495-eca5-4a6b-ad2d-1248cceeb2f1" (UID: "333a7495-eca5-4a6b-ad2d-1248cceeb2f1"). InnerVolumeSpecName "istio-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:56:29.580315 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:29.580243 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-kube-api-access-bhbn2" (OuterVolumeSpecName: "kube-api-access-bhbn2") pod "333a7495-eca5-4a6b-ad2d-1248cceeb2f1" (UID: "333a7495-eca5-4a6b-ad2d-1248cceeb2f1"). InnerVolumeSpecName "kube-api-access-bhbn2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:56:29.678529 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:29.678472 2573 reconciler_common.go:299] "Volume detached for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-local-certs\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 18:56:29.678529 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:29.678495 2573 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-istio-token\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 18:56:29.678529 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:29.678504 2573 reconciler_common.go:299] "Volume detached for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-istio-kubeconfig\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 18:56:29.678529 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:29.678513 2573 
reconciler_common.go:299] "Volume detached for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-istio-csr-dns-cert\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 18:56:29.678529 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:29.678522 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bhbn2\" (UniqueName: \"kubernetes.io/projected/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-kube-api-access-bhbn2\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 18:56:29.678529 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:29.678532 2573 reconciler_common.go:299] "Volume detached for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-cacerts\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 18:56:29.678783 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:29.678541 2573 reconciler_common.go:299] "Volume detached for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/333a7495-eca5-4a6b-ad2d-1248cceeb2f1-istio-csr-ca-configmap\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 18:56:30.135359 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:30.135324 2573 generic.go:358] "Generic (PLEG): container finished" podID="333a7495-eca5-4a6b-ad2d-1248cceeb2f1" containerID="4b3406a87714764258a52d298412d3240248d74eda5a0d74ff318ccf3184a567" exitCode=0 Apr 22 18:56:30.135764 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:30.135387 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h" Apr 22 18:56:30.135764 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:30.135413 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h" event={"ID":"333a7495-eca5-4a6b-ad2d-1248cceeb2f1","Type":"ContainerDied","Data":"4b3406a87714764258a52d298412d3240248d74eda5a0d74ff318ccf3184a567"} Apr 22 18:56:30.135764 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:30.135460 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h" event={"ID":"333a7495-eca5-4a6b-ad2d-1248cceeb2f1","Type":"ContainerDied","Data":"dd00455a2052814fb3bba612037088e31a0b4f71057f5db334920a23e285d7e1"} Apr 22 18:56:30.135764 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:30.135475 2573 scope.go:117] "RemoveContainer" containerID="4b3406a87714764258a52d298412d3240248d74eda5a0d74ff318ccf3184a567" Apr 22 18:56:30.144375 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:30.144359 2573 scope.go:117] "RemoveContainer" containerID="4b3406a87714764258a52d298412d3240248d74eda5a0d74ff318ccf3184a567" Apr 22 18:56:30.144617 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:56:30.144595 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b3406a87714764258a52d298412d3240248d74eda5a0d74ff318ccf3184a567\": container with ID starting with 4b3406a87714764258a52d298412d3240248d74eda5a0d74ff318ccf3184a567 not found: ID does not exist" containerID="4b3406a87714764258a52d298412d3240248d74eda5a0d74ff318ccf3184a567" Apr 22 18:56:30.144716 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:30.144623 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b3406a87714764258a52d298412d3240248d74eda5a0d74ff318ccf3184a567"} err="failed to get container status 
\"4b3406a87714764258a52d298412d3240248d74eda5a0d74ff318ccf3184a567\": rpc error: code = NotFound desc = could not find container \"4b3406a87714764258a52d298412d3240248d74eda5a0d74ff318ccf3184a567\": container with ID starting with 4b3406a87714764258a52d298412d3240248d74eda5a0d74ff318ccf3184a567 not found: ID does not exist" Apr 22 18:56:30.154470 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:30.154447 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h"] Apr 22 18:56:30.157839 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:30.157818 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gwz9h"] Apr 22 18:56:31.363678 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:31.363635 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="333a7495-eca5-4a6b-ad2d-1248cceeb2f1" path="/var/lib/kubelet/pods/333a7495-eca5-4a6b-ad2d-1248cceeb2f1/volumes" Apr 22 18:56:31.997085 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:31.997048 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-74f8bc794f-ng6xl"] Apr 22 18:56:31.997389 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:31.997377 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="333a7495-eca5-4a6b-ad2d-1248cceeb2f1" containerName="discovery" Apr 22 18:56:31.997435 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:31.997392 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="333a7495-eca5-4a6b-ad2d-1248cceeb2f1" containerName="discovery" Apr 22 18:56:31.997468 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:31.997444 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="333a7495-eca5-4a6b-ad2d-1248cceeb2f1" containerName="discovery" Apr 22 18:56:32.001739 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:32.001718 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-74f8bc794f-ng6xl" Apr 22 18:56:32.005298 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:32.005278 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-nlb48\"" Apr 22 18:56:32.005416 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:32.005321 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 18:56:32.005416 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:32.005280 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 18:56:32.005416 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:32.005402 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 22 18:56:32.009294 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:32.009196 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-74f8bc794f-ng6xl"] Apr 22 18:56:32.097483 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:32.097442 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gwhv\" (UniqueName: \"kubernetes.io/projected/d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c-kube-api-access-9gwhv\") pod \"llmisvc-controller-manager-74f8bc794f-ng6xl\" (UID: \"d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c\") " pod="kserve/llmisvc-controller-manager-74f8bc794f-ng6xl" Apr 22 18:56:32.097483 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:32.097483 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c-cert\") pod \"llmisvc-controller-manager-74f8bc794f-ng6xl\" (UID: \"d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c\") " pod="kserve/llmisvc-controller-manager-74f8bc794f-ng6xl" Apr 22 
18:56:32.198527 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:32.198490 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c-cert\") pod \"llmisvc-controller-manager-74f8bc794f-ng6xl\" (UID: \"d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c\") " pod="kserve/llmisvc-controller-manager-74f8bc794f-ng6xl" Apr 22 18:56:32.198689 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:32.198577 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9gwhv\" (UniqueName: \"kubernetes.io/projected/d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c-kube-api-access-9gwhv\") pod \"llmisvc-controller-manager-74f8bc794f-ng6xl\" (UID: \"d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c\") " pod="kserve/llmisvc-controller-manager-74f8bc794f-ng6xl" Apr 22 18:56:32.201008 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:32.200989 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c-cert\") pod \"llmisvc-controller-manager-74f8bc794f-ng6xl\" (UID: \"d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c\") " pod="kserve/llmisvc-controller-manager-74f8bc794f-ng6xl" Apr 22 18:56:32.210867 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:32.210842 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gwhv\" (UniqueName: \"kubernetes.io/projected/d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c-kube-api-access-9gwhv\") pod \"llmisvc-controller-manager-74f8bc794f-ng6xl\" (UID: \"d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c\") " pod="kserve/llmisvc-controller-manager-74f8bc794f-ng6xl" Apr 22 18:56:32.312775 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:32.312689 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-74f8bc794f-ng6xl" Apr 22 18:56:32.436016 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:32.435981 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-74f8bc794f-ng6xl"] Apr 22 18:56:32.438838 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:56:32.438807 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd9311ef2_e4cd_4cf7_bee6_326c98b3ef2c.slice/crio-dd321f73edf11399383db2b590f10938dd374b3049cedf06e8c919b01931e6fe WatchSource:0}: Error finding container dd321f73edf11399383db2b590f10938dd374b3049cedf06e8c919b01931e6fe: Status 404 returned error can't find the container with id dd321f73edf11399383db2b590f10938dd374b3049cedf06e8c919b01931e6fe Apr 22 18:56:33.146747 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:33.146713 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-74f8bc794f-ng6xl" event={"ID":"d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c","Type":"ContainerStarted","Data":"dd321f73edf11399383db2b590f10938dd374b3049cedf06e8c919b01931e6fe"} Apr 22 18:56:37.164637 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:37.164603 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-74f8bc794f-ng6xl" event={"ID":"d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c","Type":"ContainerStarted","Data":"85a642165f204157f5646f999372a88d0d9712cf29b2a18d2768f2ec312a56ed"} Apr 22 18:56:37.165001 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:37.164657 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-74f8bc794f-ng6xl" Apr 22 18:56:37.180058 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:37.180014 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-74f8bc794f-ng6xl" podStartSLOduration=2.586486227 podStartE2EDuration="6.179999078s" 
podCreationTimestamp="2026-04-22 18:56:31 +0000 UTC" firstStartedPulling="2026-04-22 18:56:32.440039515 +0000 UTC m=+587.680155596" lastFinishedPulling="2026-04-22 18:56:36.033552364 +0000 UTC m=+591.273668447" observedRunningTime="2026-04-22 18:56:37.178711113 +0000 UTC m=+592.418827217" watchObservedRunningTime="2026-04-22 18:56:37.179999078 +0000 UTC m=+592.420115180" Apr 22 18:56:45.309905 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:45.309883 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cr6wp_bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278/ovn-acl-logging/0.log" Apr 22 18:56:45.310191 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:56:45.309960 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cr6wp_bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278/ovn-acl-logging/0.log" Apr 22 18:57:08.169944 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:57:08.169864 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-74f8bc794f-ng6xl" Apr 22 18:57:43.034669 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:57:43.034635 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-wtfk9"] Apr 22 18:57:43.038099 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:57:43.038075 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-wtfk9" Apr 22 18:57:43.040450 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:57:43.040432 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 22 18:57:43.040531 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:57:43.040483 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-8hvpq\"" Apr 22 18:57:43.043751 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:57:43.043727 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-wtfk9"] Apr 22 18:57:43.128880 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:57:43.128849 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4858a3cd-5f36-4406-9da0-7fe15c84d8b5-tls-certs\") pod \"model-serving-api-86f7b4b499-wtfk9\" (UID: \"4858a3cd-5f36-4406-9da0-7fe15c84d8b5\") " pod="kserve/model-serving-api-86f7b4b499-wtfk9" Apr 22 18:57:43.129019 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:57:43.128886 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl8s9\" (UniqueName: \"kubernetes.io/projected/4858a3cd-5f36-4406-9da0-7fe15c84d8b5-kube-api-access-fl8s9\") pod \"model-serving-api-86f7b4b499-wtfk9\" (UID: \"4858a3cd-5f36-4406-9da0-7fe15c84d8b5\") " pod="kserve/model-serving-api-86f7b4b499-wtfk9" Apr 22 18:57:43.229860 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:57:43.229828 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4858a3cd-5f36-4406-9da0-7fe15c84d8b5-tls-certs\") pod \"model-serving-api-86f7b4b499-wtfk9\" (UID: \"4858a3cd-5f36-4406-9da0-7fe15c84d8b5\") " pod="kserve/model-serving-api-86f7b4b499-wtfk9" Apr 22 18:57:43.230032 ip-10-0-133-84 kubenswrapper[2573]: 
I0422 18:57:43.229877 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fl8s9\" (UniqueName: \"kubernetes.io/projected/4858a3cd-5f36-4406-9da0-7fe15c84d8b5-kube-api-access-fl8s9\") pod \"model-serving-api-86f7b4b499-wtfk9\" (UID: \"4858a3cd-5f36-4406-9da0-7fe15c84d8b5\") " pod="kserve/model-serving-api-86f7b4b499-wtfk9" Apr 22 18:57:43.232584 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:57:43.232560 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4858a3cd-5f36-4406-9da0-7fe15c84d8b5-tls-certs\") pod \"model-serving-api-86f7b4b499-wtfk9\" (UID: \"4858a3cd-5f36-4406-9da0-7fe15c84d8b5\") " pod="kserve/model-serving-api-86f7b4b499-wtfk9" Apr 22 18:57:43.237137 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:57:43.237115 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl8s9\" (UniqueName: \"kubernetes.io/projected/4858a3cd-5f36-4406-9da0-7fe15c84d8b5-kube-api-access-fl8s9\") pod \"model-serving-api-86f7b4b499-wtfk9\" (UID: \"4858a3cd-5f36-4406-9da0-7fe15c84d8b5\") " pod="kserve/model-serving-api-86f7b4b499-wtfk9" Apr 22 18:57:43.349240 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:57:43.349127 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-wtfk9" Apr 22 18:57:43.481384 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:57:43.481358 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-wtfk9"] Apr 22 18:57:43.484005 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:57:43.483964 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4858a3cd_5f36_4406_9da0_7fe15c84d8b5.slice/crio-546744cdc6d702b32a1618fadeb28ad87a1c0b3acbef1b5b000f780fcd55d016 WatchSource:0}: Error finding container 546744cdc6d702b32a1618fadeb28ad87a1c0b3acbef1b5b000f780fcd55d016: Status 404 returned error can't find the container with id 546744cdc6d702b32a1618fadeb28ad87a1c0b3acbef1b5b000f780fcd55d016 Apr 22 18:57:43.485901 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:57:43.485879 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:57:44.387912 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:57:44.387876 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-wtfk9" event={"ID":"4858a3cd-5f36-4406-9da0-7fe15c84d8b5","Type":"ContainerStarted","Data":"546744cdc6d702b32a1618fadeb28ad87a1c0b3acbef1b5b000f780fcd55d016"} Apr 22 18:57:46.395377 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:57:46.395340 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-wtfk9" event={"ID":"4858a3cd-5f36-4406-9da0-7fe15c84d8b5","Type":"ContainerStarted","Data":"63d302968a22151dccaa8dd59da73600c20f596fad9cbefef7287fe2a356907a"} Apr 22 18:57:46.395830 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:57:46.395464 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-wtfk9" Apr 22 18:57:46.410678 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:57:46.410631 2573 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-wtfk9" podStartSLOduration=1.126480381 podStartE2EDuration="3.410618381s" podCreationTimestamp="2026-04-22 18:57:43 +0000 UTC" firstStartedPulling="2026-04-22 18:57:43.486055666 +0000 UTC m=+658.726171747" lastFinishedPulling="2026-04-22 18:57:45.770193666 +0000 UTC m=+661.010309747" observedRunningTime="2026-04-22 18:57:46.409189302 +0000 UTC m=+661.649305394" watchObservedRunningTime="2026-04-22 18:57:46.410618381 +0000 UTC m=+661.650734482" Apr 22 18:57:57.403186 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:57:57.403136 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-wtfk9" Apr 22 18:58:32.256928 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:58:32.256889 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd"] Apr 22 18:58:32.261832 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:58:32.261799 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" Apr 22 18:58:32.264498 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:58:32.264467 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-epp-sa-dockercfg-cpfj2\"" Apr 22 18:58:32.264636 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:58:32.264478 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 18:58:32.264636 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:58:32.264546 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 18:58:32.264636 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:58:32.264582 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4mhhb\"" Apr 22 18:58:32.265528 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:58:32.265502 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 22 18:58:32.270706 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:58:32.270664 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd"] Apr 22 18:58:32.333582 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:58:32.333550 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/786ad670-3793-4031-8b37-a96d03af0e32-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd\" (UID: \"786ad670-3793-4031-8b37-a96d03af0e32\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" Apr 22 18:58:32.333746 ip-10-0-133-84 
kubenswrapper[2573]: I0422 18:58:32.333589 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/786ad670-3793-4031-8b37-a96d03af0e32-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd\" (UID: \"786ad670-3793-4031-8b37-a96d03af0e32\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" Apr 22 18:58:32.333746 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:58:32.333618 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/786ad670-3793-4031-8b37-a96d03af0e32-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd\" (UID: \"786ad670-3793-4031-8b37-a96d03af0e32\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" Apr 22 18:58:32.333746 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:58:32.333676 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/786ad670-3793-4031-8b37-a96d03af0e32-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd\" (UID: \"786ad670-3793-4031-8b37-a96d03af0e32\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" Apr 22 18:58:32.333746 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:58:32.333702 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/786ad670-3793-4031-8b37-a96d03af0e32-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd\" (UID: \"786ad670-3793-4031-8b37-a96d03af0e32\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" Apr 22 
18:58:32.333746 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:58:32.333717 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7m56\" (UniqueName: \"kubernetes.io/projected/786ad670-3793-4031-8b37-a96d03af0e32-kube-api-access-p7m56\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd\" (UID: \"786ad670-3793-4031-8b37-a96d03af0e32\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" Apr 22 18:58:32.434657 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:58:32.434622 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/786ad670-3793-4031-8b37-a96d03af0e32-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd\" (UID: \"786ad670-3793-4031-8b37-a96d03af0e32\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" Apr 22 18:58:32.434657 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:58:32.434660 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/786ad670-3793-4031-8b37-a96d03af0e32-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd\" (UID: \"786ad670-3793-4031-8b37-a96d03af0e32\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" Apr 22 18:58:32.434889 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:58:32.434685 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/786ad670-3793-4031-8b37-a96d03af0e32-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd\" (UID: \"786ad670-3793-4031-8b37-a96d03af0e32\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" Apr 22 
18:58:32.434889 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:58:32.434850 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/786ad670-3793-4031-8b37-a96d03af0e32-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd\" (UID: \"786ad670-3793-4031-8b37-a96d03af0e32\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" Apr 22 18:58:32.434998 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:58:32.434905 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/786ad670-3793-4031-8b37-a96d03af0e32-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd\" (UID: \"786ad670-3793-4031-8b37-a96d03af0e32\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" Apr 22 18:58:32.434998 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:58:32.434943 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p7m56\" (UniqueName: \"kubernetes.io/projected/786ad670-3793-4031-8b37-a96d03af0e32-kube-api-access-p7m56\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd\" (UID: \"786ad670-3793-4031-8b37-a96d03af0e32\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" Apr 22 18:58:32.435155 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:58:32.435128 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/786ad670-3793-4031-8b37-a96d03af0e32-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd\" (UID: \"786ad670-3793-4031-8b37-a96d03af0e32\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" Apr 22 18:58:32.435332 ip-10-0-133-84 kubenswrapper[2573]: I0422 
18:58:32.435202 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/786ad670-3793-4031-8b37-a96d03af0e32-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd\" (UID: \"786ad670-3793-4031-8b37-a96d03af0e32\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" Apr 22 18:58:32.435332 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:58:32.435282 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/786ad670-3793-4031-8b37-a96d03af0e32-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd\" (UID: \"786ad670-3793-4031-8b37-a96d03af0e32\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" Apr 22 18:58:32.435474 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:58:32.435457 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/786ad670-3793-4031-8b37-a96d03af0e32-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd\" (UID: \"786ad670-3793-4031-8b37-a96d03af0e32\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" Apr 22 18:58:32.437638 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:58:32.437617 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/786ad670-3793-4031-8b37-a96d03af0e32-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd\" (UID: \"786ad670-3793-4031-8b37-a96d03af0e32\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" Apr 22 18:58:32.453402 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:58:32.453380 2573 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-p7m56\" (UniqueName: \"kubernetes.io/projected/786ad670-3793-4031-8b37-a96d03af0e32-kube-api-access-p7m56\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd\" (UID: \"786ad670-3793-4031-8b37-a96d03af0e32\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" Apr 22 18:58:32.575444 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:58:32.575380 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" Apr 22 18:58:32.695803 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:58:32.695778 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd"] Apr 22 18:58:32.698621 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:58:32.698594 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod786ad670_3793_4031_8b37_a96d03af0e32.slice/crio-f4c05cb43720a0abd4e63a8ad9890128dbff725d5f0efe83701b303acc775518 WatchSource:0}: Error finding container f4c05cb43720a0abd4e63a8ad9890128dbff725d5f0efe83701b303acc775518: Status 404 returned error can't find the container with id f4c05cb43720a0abd4e63a8ad9890128dbff725d5f0efe83701b303acc775518 Apr 22 18:58:33.550578 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:58:33.550512 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" event={"ID":"786ad670-3793-4031-8b37-a96d03af0e32","Type":"ContainerStarted","Data":"f4c05cb43720a0abd4e63a8ad9890128dbff725d5f0efe83701b303acc775518"} Apr 22 18:58:36.563219 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:58:36.563150 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" 
event={"ID":"786ad670-3793-4031-8b37-a96d03af0e32","Type":"ContainerStarted","Data":"267116cbf05461352d03f95ca98364cd8289d4a38c3fc61f014f4d75748e09b3"} Apr 22 18:58:37.567891 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:58:37.567857 2573 generic.go:358] "Generic (PLEG): container finished" podID="786ad670-3793-4031-8b37-a96d03af0e32" containerID="267116cbf05461352d03f95ca98364cd8289d4a38c3fc61f014f4d75748e09b3" exitCode=0 Apr 22 18:58:37.568278 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:58:37.567922 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" event={"ID":"786ad670-3793-4031-8b37-a96d03af0e32","Type":"ContainerDied","Data":"267116cbf05461352d03f95ca98364cd8289d4a38c3fc61f014f4d75748e09b3"} Apr 22 18:58:39.582316 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:58:39.582263 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" event={"ID":"786ad670-3793-4031-8b37-a96d03af0e32","Type":"ContainerStarted","Data":"a6a66d6ccccc7e869c337028bd697cb6d4c9d8297980edeaa111e10a56201f3f"} Apr 22 18:59:08.699390 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:08.699353 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" event={"ID":"786ad670-3793-4031-8b37-a96d03af0e32","Type":"ContainerStarted","Data":"240f6b13a54789cb38f218e13092b0aefce70449311eb8deb58cd8a981bccbb8"} Apr 22 18:59:08.699832 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:08.699641 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" Apr 22 18:59:08.702140 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:08.702116 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" Apr 22 18:59:08.719828 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:08.719782 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" podStartSLOduration=0.892283679 podStartE2EDuration="36.719769793s" podCreationTimestamp="2026-04-22 18:58:32 +0000 UTC" firstStartedPulling="2026-04-22 18:58:32.70046371 +0000 UTC m=+707.940579790" lastFinishedPulling="2026-04-22 18:59:08.527949808 +0000 UTC m=+743.768065904" observedRunningTime="2026-04-22 18:59:08.717253213 +0000 UTC m=+743.957369323" watchObservedRunningTime="2026-04-22 18:59:08.719769793 +0000 UTC m=+743.959885895" Apr 22 18:59:12.575785 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:12.575742 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" Apr 22 18:59:12.576274 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:12.575796 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" Apr 22 18:59:22.578068 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:22.578035 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" Apr 22 18:59:22.579200 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:22.579162 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" Apr 22 18:59:23.582374 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:23.582341 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd"] Apr 22 
18:59:23.750896 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:23.750850 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" podUID="786ad670-3793-4031-8b37-a96d03af0e32" containerName="main" containerID="cri-o://a6a66d6ccccc7e869c337028bd697cb6d4c9d8297980edeaa111e10a56201f3f" gracePeriod=30 Apr 22 18:59:23.750896 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:23.750887 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" podUID="786ad670-3793-4031-8b37-a96d03af0e32" containerName="tokenizer" containerID="cri-o://240f6b13a54789cb38f218e13092b0aefce70449311eb8deb58cd8a981bccbb8" gracePeriod=30 Apr 22 18:59:24.755634 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:24.755599 2573 generic.go:358] "Generic (PLEG): container finished" podID="786ad670-3793-4031-8b37-a96d03af0e32" containerID="a6a66d6ccccc7e869c337028bd697cb6d4c9d8297980edeaa111e10a56201f3f" exitCode=0 Apr 22 18:59:24.755990 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:24.755656 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" event={"ID":"786ad670-3793-4031-8b37-a96d03af0e32","Type":"ContainerDied","Data":"a6a66d6ccccc7e869c337028bd697cb6d4c9d8297980edeaa111e10a56201f3f"} Apr 22 18:59:25.419283 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:25.419258 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" Apr 22 18:59:25.497415 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:25.497388 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/786ad670-3793-4031-8b37-a96d03af0e32-tokenizer-cache\") pod \"786ad670-3793-4031-8b37-a96d03af0e32\" (UID: \"786ad670-3793-4031-8b37-a96d03af0e32\") " Apr 22 18:59:25.497586 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:25.497422 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/786ad670-3793-4031-8b37-a96d03af0e32-tokenizer-tmp\") pod \"786ad670-3793-4031-8b37-a96d03af0e32\" (UID: \"786ad670-3793-4031-8b37-a96d03af0e32\") " Apr 22 18:59:25.497586 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:25.497460 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/786ad670-3793-4031-8b37-a96d03af0e32-kserve-provision-location\") pod \"786ad670-3793-4031-8b37-a96d03af0e32\" (UID: \"786ad670-3793-4031-8b37-a96d03af0e32\") " Apr 22 18:59:25.497586 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:25.497486 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/786ad670-3793-4031-8b37-a96d03af0e32-tokenizer-uds\") pod \"786ad670-3793-4031-8b37-a96d03af0e32\" (UID: \"786ad670-3793-4031-8b37-a96d03af0e32\") " Apr 22 18:59:25.497586 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:25.497520 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7m56\" (UniqueName: \"kubernetes.io/projected/786ad670-3793-4031-8b37-a96d03af0e32-kube-api-access-p7m56\") pod \"786ad670-3793-4031-8b37-a96d03af0e32\" (UID: 
\"786ad670-3793-4031-8b37-a96d03af0e32\") " Apr 22 18:59:25.497586 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:25.497563 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/786ad670-3793-4031-8b37-a96d03af0e32-tls-certs\") pod \"786ad670-3793-4031-8b37-a96d03af0e32\" (UID: \"786ad670-3793-4031-8b37-a96d03af0e32\") " Apr 22 18:59:25.497828 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:25.497677 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/786ad670-3793-4031-8b37-a96d03af0e32-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "786ad670-3793-4031-8b37-a96d03af0e32" (UID: "786ad670-3793-4031-8b37-a96d03af0e32"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:59:25.497828 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:25.497788 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/786ad670-3793-4031-8b37-a96d03af0e32-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "786ad670-3793-4031-8b37-a96d03af0e32" (UID: "786ad670-3793-4031-8b37-a96d03af0e32"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:59:25.497828 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:25.497816 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/786ad670-3793-4031-8b37-a96d03af0e32-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "786ad670-3793-4031-8b37-a96d03af0e32" (UID: "786ad670-3793-4031-8b37-a96d03af0e32"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:59:25.497984 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:25.497844 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/786ad670-3793-4031-8b37-a96d03af0e32-tokenizer-uds\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 18:59:25.497984 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:25.497854 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/786ad670-3793-4031-8b37-a96d03af0e32-tokenizer-cache\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 18:59:25.498219 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:25.498199 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/786ad670-3793-4031-8b37-a96d03af0e32-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "786ad670-3793-4031-8b37-a96d03af0e32" (UID: "786ad670-3793-4031-8b37-a96d03af0e32"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:59:25.499947 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:25.499916 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/786ad670-3793-4031-8b37-a96d03af0e32-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "786ad670-3793-4031-8b37-a96d03af0e32" (UID: "786ad670-3793-4031-8b37-a96d03af0e32"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:59:25.499947 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:25.499938 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/786ad670-3793-4031-8b37-a96d03af0e32-kube-api-access-p7m56" (OuterVolumeSpecName: "kube-api-access-p7m56") pod "786ad670-3793-4031-8b37-a96d03af0e32" (UID: "786ad670-3793-4031-8b37-a96d03af0e32"). 
InnerVolumeSpecName "kube-api-access-p7m56". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:59:25.598572 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:25.598488 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/786ad670-3793-4031-8b37-a96d03af0e32-tokenizer-tmp\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 18:59:25.598572 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:25.598518 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/786ad670-3793-4031-8b37-a96d03af0e32-kserve-provision-location\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 18:59:25.598572 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:25.598528 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p7m56\" (UniqueName: \"kubernetes.io/projected/786ad670-3793-4031-8b37-a96d03af0e32-kube-api-access-p7m56\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 18:59:25.598572 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:25.598538 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/786ad670-3793-4031-8b37-a96d03af0e32-tls-certs\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 18:59:25.761458 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:25.761421 2573 generic.go:358] "Generic (PLEG): container finished" podID="786ad670-3793-4031-8b37-a96d03af0e32" containerID="240f6b13a54789cb38f218e13092b0aefce70449311eb8deb58cd8a981bccbb8" exitCode=0 Apr 22 18:59:25.761847 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:25.761498 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" Apr 22 18:59:25.761847 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:25.761497 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" event={"ID":"786ad670-3793-4031-8b37-a96d03af0e32","Type":"ContainerDied","Data":"240f6b13a54789cb38f218e13092b0aefce70449311eb8deb58cd8a981bccbb8"} Apr 22 18:59:25.761847 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:25.761542 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd" event={"ID":"786ad670-3793-4031-8b37-a96d03af0e32","Type":"ContainerDied","Data":"f4c05cb43720a0abd4e63a8ad9890128dbff725d5f0efe83701b303acc775518"} Apr 22 18:59:25.761847 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:25.761560 2573 scope.go:117] "RemoveContainer" containerID="240f6b13a54789cb38f218e13092b0aefce70449311eb8deb58cd8a981bccbb8" Apr 22 18:59:25.771338 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:25.771319 2573 scope.go:117] "RemoveContainer" containerID="a6a66d6ccccc7e869c337028bd697cb6d4c9d8297980edeaa111e10a56201f3f" Apr 22 18:59:25.784229 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:25.784197 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd"] Apr 22 18:59:25.785610 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:25.785592 2573 scope.go:117] "RemoveContainer" containerID="267116cbf05461352d03f95ca98364cd8289d4a38c3fc61f014f4d75748e09b3" Apr 22 18:59:25.789425 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:25.789400 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6d6ccm8jjd"] Apr 22 18:59:25.793292 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:25.793271 2573 
scope.go:117] "RemoveContainer" containerID="240f6b13a54789cb38f218e13092b0aefce70449311eb8deb58cd8a981bccbb8" Apr 22 18:59:25.793613 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:59:25.793585 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"240f6b13a54789cb38f218e13092b0aefce70449311eb8deb58cd8a981bccbb8\": container with ID starting with 240f6b13a54789cb38f218e13092b0aefce70449311eb8deb58cd8a981bccbb8 not found: ID does not exist" containerID="240f6b13a54789cb38f218e13092b0aefce70449311eb8deb58cd8a981bccbb8" Apr 22 18:59:25.793691 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:25.793617 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"240f6b13a54789cb38f218e13092b0aefce70449311eb8deb58cd8a981bccbb8"} err="failed to get container status \"240f6b13a54789cb38f218e13092b0aefce70449311eb8deb58cd8a981bccbb8\": rpc error: code = NotFound desc = could not find container \"240f6b13a54789cb38f218e13092b0aefce70449311eb8deb58cd8a981bccbb8\": container with ID starting with 240f6b13a54789cb38f218e13092b0aefce70449311eb8deb58cd8a981bccbb8 not found: ID does not exist" Apr 22 18:59:25.793691 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:25.793643 2573 scope.go:117] "RemoveContainer" containerID="a6a66d6ccccc7e869c337028bd697cb6d4c9d8297980edeaa111e10a56201f3f" Apr 22 18:59:25.793926 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:59:25.793907 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6a66d6ccccc7e869c337028bd697cb6d4c9d8297980edeaa111e10a56201f3f\": container with ID starting with a6a66d6ccccc7e869c337028bd697cb6d4c9d8297980edeaa111e10a56201f3f not found: ID does not exist" containerID="a6a66d6ccccc7e869c337028bd697cb6d4c9d8297980edeaa111e10a56201f3f" Apr 22 18:59:25.793965 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:25.793936 2573 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6a66d6ccccc7e869c337028bd697cb6d4c9d8297980edeaa111e10a56201f3f"} err="failed to get container status \"a6a66d6ccccc7e869c337028bd697cb6d4c9d8297980edeaa111e10a56201f3f\": rpc error: code = NotFound desc = could not find container \"a6a66d6ccccc7e869c337028bd697cb6d4c9d8297980edeaa111e10a56201f3f\": container with ID starting with a6a66d6ccccc7e869c337028bd697cb6d4c9d8297980edeaa111e10a56201f3f not found: ID does not exist" Apr 22 18:59:25.793965 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:25.793953 2573 scope.go:117] "RemoveContainer" containerID="267116cbf05461352d03f95ca98364cd8289d4a38c3fc61f014f4d75748e09b3" Apr 22 18:59:25.794213 ip-10-0-133-84 kubenswrapper[2573]: E0422 18:59:25.794191 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"267116cbf05461352d03f95ca98364cd8289d4a38c3fc61f014f4d75748e09b3\": container with ID starting with 267116cbf05461352d03f95ca98364cd8289d4a38c3fc61f014f4d75748e09b3 not found: ID does not exist" containerID="267116cbf05461352d03f95ca98364cd8289d4a38c3fc61f014f4d75748e09b3" Apr 22 18:59:25.794265 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:25.794223 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"267116cbf05461352d03f95ca98364cd8289d4a38c3fc61f014f4d75748e09b3"} err="failed to get container status \"267116cbf05461352d03f95ca98364cd8289d4a38c3fc61f014f4d75748e09b3\": rpc error: code = NotFound desc = could not find container \"267116cbf05461352d03f95ca98364cd8289d4a38c3fc61f014f4d75748e09b3\": container with ID starting with 267116cbf05461352d03f95ca98364cd8289d4a38c3fc61f014f4d75748e09b3 not found: ID does not exist" Apr 22 18:59:27.364145 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:27.364109 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="786ad670-3793-4031-8b37-a96d03af0e32" 
path="/var/lib/kubelet/pods/786ad670-3793-4031-8b37-a96d03af0e32/volumes" Apr 22 18:59:30.622392 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:30.622352 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp"] Apr 22 18:59:30.622893 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:30.622873 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="786ad670-3793-4031-8b37-a96d03af0e32" containerName="storage-initializer" Apr 22 18:59:30.623042 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:30.622921 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="786ad670-3793-4031-8b37-a96d03af0e32" containerName="storage-initializer" Apr 22 18:59:30.623042 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:30.622937 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="786ad670-3793-4031-8b37-a96d03af0e32" containerName="main" Apr 22 18:59:30.623042 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:30.622947 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="786ad670-3793-4031-8b37-a96d03af0e32" containerName="main" Apr 22 18:59:30.623042 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:30.622995 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="786ad670-3793-4031-8b37-a96d03af0e32" containerName="tokenizer" Apr 22 18:59:30.623042 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:30.623004 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="786ad670-3793-4031-8b37-a96d03af0e32" containerName="tokenizer" Apr 22 18:59:30.623338 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:30.623092 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="786ad670-3793-4031-8b37-a96d03af0e32" containerName="main" Apr 22 18:59:30.623338 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:30.623111 2573 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="786ad670-3793-4031-8b37-a96d03af0e32" containerName="tokenizer" Apr 22 18:59:30.914557 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:30.914479 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp"] Apr 22 18:59:30.914717 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:30.914625 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" Apr 22 18:59:30.918921 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:30.918879 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 18:59:30.918921 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:30.918880 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-xjm7q\"" Apr 22 18:59:30.919122 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:30.918935 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 22 18:59:30.919122 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:30.919012 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 18:59:30.919263 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:30.919154 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4mhhb\"" Apr 22 18:59:31.039075 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:31.039045 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ee855db9-55e2-469e-96e4-2b438bfc4923-tokenizer-uds\") pod 
\"scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp\" (UID: \"ee855db9-55e2-469e-96e4-2b438bfc4923\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" Apr 22 18:59:31.039075 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:31.039078 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ee855db9-55e2-469e-96e4-2b438bfc4923-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp\" (UID: \"ee855db9-55e2-469e-96e4-2b438bfc4923\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" Apr 22 18:59:31.039293 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:31.039105 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ee855db9-55e2-469e-96e4-2b438bfc4923-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp\" (UID: \"ee855db9-55e2-469e-96e4-2b438bfc4923\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" Apr 22 18:59:31.039293 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:31.039210 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhd72\" (UniqueName: \"kubernetes.io/projected/ee855db9-55e2-469e-96e4-2b438bfc4923-kube-api-access-vhd72\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp\" (UID: \"ee855db9-55e2-469e-96e4-2b438bfc4923\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" Apr 22 18:59:31.039293 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:31.039266 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/ee855db9-55e2-469e-96e4-2b438bfc4923-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp\" (UID: \"ee855db9-55e2-469e-96e4-2b438bfc4923\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" Apr 22 18:59:31.039390 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:31.039311 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ee855db9-55e2-469e-96e4-2b438bfc4923-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp\" (UID: \"ee855db9-55e2-469e-96e4-2b438bfc4923\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" Apr 22 18:59:31.140080 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:31.140046 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ee855db9-55e2-469e-96e4-2b438bfc4923-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp\" (UID: \"ee855db9-55e2-469e-96e4-2b438bfc4923\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" Apr 22 18:59:31.140326 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:31.140157 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ee855db9-55e2-469e-96e4-2b438bfc4923-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp\" (UID: \"ee855db9-55e2-469e-96e4-2b438bfc4923\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" Apr 22 18:59:31.140326 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:31.140224 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" 
(UniqueName: \"kubernetes.io/empty-dir/ee855db9-55e2-469e-96e4-2b438bfc4923-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp\" (UID: \"ee855db9-55e2-469e-96e4-2b438bfc4923\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" Apr 22 18:59:31.140326 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:31.140258 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ee855db9-55e2-469e-96e4-2b438bfc4923-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp\" (UID: \"ee855db9-55e2-469e-96e4-2b438bfc4923\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" Apr 22 18:59:31.140326 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:31.140287 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhd72\" (UniqueName: \"kubernetes.io/projected/ee855db9-55e2-469e-96e4-2b438bfc4923-kube-api-access-vhd72\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp\" (UID: \"ee855db9-55e2-469e-96e4-2b438bfc4923\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" Apr 22 18:59:31.140565 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:31.140341 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ee855db9-55e2-469e-96e4-2b438bfc4923-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp\" (UID: \"ee855db9-55e2-469e-96e4-2b438bfc4923\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" Apr 22 18:59:31.140620 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:31.140580 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/ee855db9-55e2-469e-96e4-2b438bfc4923-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp\" (UID: \"ee855db9-55e2-469e-96e4-2b438bfc4923\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" Apr 22 18:59:31.140671 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:31.140617 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ee855db9-55e2-469e-96e4-2b438bfc4923-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp\" (UID: \"ee855db9-55e2-469e-96e4-2b438bfc4923\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" Apr 22 18:59:31.140717 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:31.140683 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ee855db9-55e2-469e-96e4-2b438bfc4923-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp\" (UID: \"ee855db9-55e2-469e-96e4-2b438bfc4923\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" Apr 22 18:59:31.140768 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:31.140724 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ee855db9-55e2-469e-96e4-2b438bfc4923-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp\" (UID: \"ee855db9-55e2-469e-96e4-2b438bfc4923\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" Apr 22 18:59:31.143441 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:31.143416 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ee855db9-55e2-469e-96e4-2b438bfc4923-tls-certs\") pod 
\"scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp\" (UID: \"ee855db9-55e2-469e-96e4-2b438bfc4923\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" Apr 22 18:59:31.148610 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:31.148580 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhd72\" (UniqueName: \"kubernetes.io/projected/ee855db9-55e2-469e-96e4-2b438bfc4923-kube-api-access-vhd72\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp\" (UID: \"ee855db9-55e2-469e-96e4-2b438bfc4923\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" Apr 22 18:59:31.224888 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:31.224798 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" Apr 22 18:59:31.350924 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:31.350894 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp"] Apr 22 18:59:31.353404 ip-10-0-133-84 kubenswrapper[2573]: W0422 18:59:31.353373 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee855db9_55e2_469e_96e4_2b438bfc4923.slice/crio-60f10c321b9e0249910e40bb7052fc4b2f3b25965c050e68447f5c13cf791f7a WatchSource:0}: Error finding container 60f10c321b9e0249910e40bb7052fc4b2f3b25965c050e68447f5c13cf791f7a: Status 404 returned error can't find the container with id 60f10c321b9e0249910e40bb7052fc4b2f3b25965c050e68447f5c13cf791f7a Apr 22 18:59:31.786692 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:31.786585 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" 
event={"ID":"ee855db9-55e2-469e-96e4-2b438bfc4923","Type":"ContainerStarted","Data":"cf00b46de7ba7c1d5d7189b6688c93f5eecedef8757569f2642a51d6d86aba46"} Apr 22 18:59:31.786692 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:31.786637 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" event={"ID":"ee855db9-55e2-469e-96e4-2b438bfc4923","Type":"ContainerStarted","Data":"60f10c321b9e0249910e40bb7052fc4b2f3b25965c050e68447f5c13cf791f7a"} Apr 22 18:59:32.791100 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:32.791066 2573 generic.go:358] "Generic (PLEG): container finished" podID="ee855db9-55e2-469e-96e4-2b438bfc4923" containerID="cf00b46de7ba7c1d5d7189b6688c93f5eecedef8757569f2642a51d6d86aba46" exitCode=0 Apr 22 18:59:32.791557 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:32.791151 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" event={"ID":"ee855db9-55e2-469e-96e4-2b438bfc4923","Type":"ContainerDied","Data":"cf00b46de7ba7c1d5d7189b6688c93f5eecedef8757569f2642a51d6d86aba46"} Apr 22 18:59:33.797217 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:33.797181 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" event={"ID":"ee855db9-55e2-469e-96e4-2b438bfc4923","Type":"ContainerStarted","Data":"c3d40e0db81290c439f2bc5be042d49a4bb0219ebcf49687c4c671a3dc9b53fc"} Apr 22 18:59:33.797217 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:33.797217 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" event={"ID":"ee855db9-55e2-469e-96e4-2b438bfc4923","Type":"ContainerStarted","Data":"606bf09a465dde8c3835e20d0795b6f9d407bcfe4bc3f0358ad07a8b671606bf"} Apr 22 18:59:33.797696 ip-10-0-133-84 kubenswrapper[2573]: I0422 
18:59:33.797349 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" Apr 22 18:59:33.817029 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:33.816976 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" podStartSLOduration=3.816961084 podStartE2EDuration="3.816961084s" podCreationTimestamp="2026-04-22 18:59:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:59:33.815026465 +0000 UTC m=+769.055142568" watchObservedRunningTime="2026-04-22 18:59:33.816961084 +0000 UTC m=+769.057077185" Apr 22 18:59:41.225446 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:41.225411 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" Apr 22 18:59:41.225446 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:41.225449 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" Apr 22 18:59:41.228034 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:41.228012 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" Apr 22 18:59:41.826709 ip-10-0-133-84 kubenswrapper[2573]: I0422 18:59:41.826683 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" Apr 22 19:00:02.829916 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:02.829835 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" Apr 22 19:00:03.797885 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:03.797841 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp"] Apr 22 19:00:03.798265 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:03.798236 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" podUID="ee855db9-55e2-469e-96e4-2b438bfc4923" containerName="main" containerID="cri-o://606bf09a465dde8c3835e20d0795b6f9d407bcfe4bc3f0358ad07a8b671606bf" gracePeriod=30 Apr 22 19:00:03.798351 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:03.798271 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" podUID="ee855db9-55e2-469e-96e4-2b438bfc4923" containerName="tokenizer" containerID="cri-o://c3d40e0db81290c439f2bc5be042d49a4bb0219ebcf49687c4c671a3dc9b53fc" gracePeriod=30 Apr 22 19:00:04.901567 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:04.901530 2573 generic.go:358] "Generic (PLEG): container finished" podID="ee855db9-55e2-469e-96e4-2b438bfc4923" containerID="606bf09a465dde8c3835e20d0795b6f9d407bcfe4bc3f0358ad07a8b671606bf" exitCode=0 Apr 22 19:00:04.901901 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:04.901599 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" event={"ID":"ee855db9-55e2-469e-96e4-2b438bfc4923","Type":"ContainerDied","Data":"606bf09a465dde8c3835e20d0795b6f9d407bcfe4bc3f0358ad07a8b671606bf"} Apr 22 19:00:05.050445 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:05.050418 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" Apr 22 19:00:05.214211 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:05.214089 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ee855db9-55e2-469e-96e4-2b438bfc4923-tokenizer-uds\") pod \"ee855db9-55e2-469e-96e4-2b438bfc4923\" (UID: \"ee855db9-55e2-469e-96e4-2b438bfc4923\") " Apr 22 19:00:05.214211 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:05.214206 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ee855db9-55e2-469e-96e4-2b438bfc4923-tokenizer-tmp\") pod \"ee855db9-55e2-469e-96e4-2b438bfc4923\" (UID: \"ee855db9-55e2-469e-96e4-2b438bfc4923\") " Apr 22 19:00:05.214443 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:05.214247 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ee855db9-55e2-469e-96e4-2b438bfc4923-tls-certs\") pod \"ee855db9-55e2-469e-96e4-2b438bfc4923\" (UID: \"ee855db9-55e2-469e-96e4-2b438bfc4923\") " Apr 22 19:00:05.214443 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:05.214290 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ee855db9-55e2-469e-96e4-2b438bfc4923-kserve-provision-location\") pod \"ee855db9-55e2-469e-96e4-2b438bfc4923\" (UID: \"ee855db9-55e2-469e-96e4-2b438bfc4923\") " Apr 22 19:00:05.214443 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:05.214322 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ee855db9-55e2-469e-96e4-2b438bfc4923-tokenizer-cache\") pod \"ee855db9-55e2-469e-96e4-2b438bfc4923\" (UID: \"ee855db9-55e2-469e-96e4-2b438bfc4923\") " Apr 22 
19:00:05.214443 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:05.214351 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhd72\" (UniqueName: \"kubernetes.io/projected/ee855db9-55e2-469e-96e4-2b438bfc4923-kube-api-access-vhd72\") pod \"ee855db9-55e2-469e-96e4-2b438bfc4923\" (UID: \"ee855db9-55e2-469e-96e4-2b438bfc4923\") " Apr 22 19:00:05.214443 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:05.214419 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee855db9-55e2-469e-96e4-2b438bfc4923-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "ee855db9-55e2-469e-96e4-2b438bfc4923" (UID: "ee855db9-55e2-469e-96e4-2b438bfc4923"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:00:05.214696 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:05.214553 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee855db9-55e2-469e-96e4-2b438bfc4923-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "ee855db9-55e2-469e-96e4-2b438bfc4923" (UID: "ee855db9-55e2-469e-96e4-2b438bfc4923"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:00:05.214696 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:05.214606 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee855db9-55e2-469e-96e4-2b438bfc4923-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "ee855db9-55e2-469e-96e4-2b438bfc4923" (UID: "ee855db9-55e2-469e-96e4-2b438bfc4923"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:00:05.214696 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:05.214649 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ee855db9-55e2-469e-96e4-2b438bfc4923-tokenizer-uds\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:00:05.214696 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:05.214667 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ee855db9-55e2-469e-96e4-2b438bfc4923-tokenizer-tmp\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:00:05.215029 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:05.215010 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee855db9-55e2-469e-96e4-2b438bfc4923-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ee855db9-55e2-469e-96e4-2b438bfc4923" (UID: "ee855db9-55e2-469e-96e4-2b438bfc4923"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:00:05.216446 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:05.216428 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee855db9-55e2-469e-96e4-2b438bfc4923-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "ee855db9-55e2-469e-96e4-2b438bfc4923" (UID: "ee855db9-55e2-469e-96e4-2b438bfc4923"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:00:05.216543 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:05.216523 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee855db9-55e2-469e-96e4-2b438bfc4923-kube-api-access-vhd72" (OuterVolumeSpecName: "kube-api-access-vhd72") pod "ee855db9-55e2-469e-96e4-2b438bfc4923" (UID: "ee855db9-55e2-469e-96e4-2b438bfc4923"). 
InnerVolumeSpecName "kube-api-access-vhd72". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:00:05.315293 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:05.315258 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ee855db9-55e2-469e-96e4-2b438bfc4923-tls-certs\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:00:05.315293 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:05.315287 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ee855db9-55e2-469e-96e4-2b438bfc4923-kserve-provision-location\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:00:05.315293 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:05.315298 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ee855db9-55e2-469e-96e4-2b438bfc4923-tokenizer-cache\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:00:05.315517 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:05.315307 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vhd72\" (UniqueName: \"kubernetes.io/projected/ee855db9-55e2-469e-96e4-2b438bfc4923-kube-api-access-vhd72\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:00:05.906983 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:05.906945 2573 generic.go:358] "Generic (PLEG): container finished" podID="ee855db9-55e2-469e-96e4-2b438bfc4923" containerID="c3d40e0db81290c439f2bc5be042d49a4bb0219ebcf49687c4c671a3dc9b53fc" exitCode=0 Apr 22 19:00:05.907407 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:05.907042 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" 
event={"ID":"ee855db9-55e2-469e-96e4-2b438bfc4923","Type":"ContainerDied","Data":"c3d40e0db81290c439f2bc5be042d49a4bb0219ebcf49687c4c671a3dc9b53fc"} Apr 22 19:00:05.907407 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:05.907073 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" Apr 22 19:00:05.907407 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:05.907085 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp" event={"ID":"ee855db9-55e2-469e-96e4-2b438bfc4923","Type":"ContainerDied","Data":"60f10c321b9e0249910e40bb7052fc4b2f3b25965c050e68447f5c13cf791f7a"} Apr 22 19:00:05.907407 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:05.907101 2573 scope.go:117] "RemoveContainer" containerID="c3d40e0db81290c439f2bc5be042d49a4bb0219ebcf49687c4c671a3dc9b53fc" Apr 22 19:00:05.915326 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:05.915310 2573 scope.go:117] "RemoveContainer" containerID="606bf09a465dde8c3835e20d0795b6f9d407bcfe4bc3f0358ad07a8b671606bf" Apr 22 19:00:05.923427 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:05.923147 2573 scope.go:117] "RemoveContainer" containerID="cf00b46de7ba7c1d5d7189b6688c93f5eecedef8757569f2642a51d6d86aba46" Apr 22 19:00:05.924384 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:05.924364 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp"] Apr 22 19:00:05.927506 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:05.927486 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97d5fl85rp"] Apr 22 19:00:05.930853 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:05.930835 2573 scope.go:117] "RemoveContainer" 
containerID="c3d40e0db81290c439f2bc5be042d49a4bb0219ebcf49687c4c671a3dc9b53fc" Apr 22 19:00:05.931104 ip-10-0-133-84 kubenswrapper[2573]: E0422 19:00:05.931088 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3d40e0db81290c439f2bc5be042d49a4bb0219ebcf49687c4c671a3dc9b53fc\": container with ID starting with c3d40e0db81290c439f2bc5be042d49a4bb0219ebcf49687c4c671a3dc9b53fc not found: ID does not exist" containerID="c3d40e0db81290c439f2bc5be042d49a4bb0219ebcf49687c4c671a3dc9b53fc" Apr 22 19:00:05.931150 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:05.931111 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3d40e0db81290c439f2bc5be042d49a4bb0219ebcf49687c4c671a3dc9b53fc"} err="failed to get container status \"c3d40e0db81290c439f2bc5be042d49a4bb0219ebcf49687c4c671a3dc9b53fc\": rpc error: code = NotFound desc = could not find container \"c3d40e0db81290c439f2bc5be042d49a4bb0219ebcf49687c4c671a3dc9b53fc\": container with ID starting with c3d40e0db81290c439f2bc5be042d49a4bb0219ebcf49687c4c671a3dc9b53fc not found: ID does not exist" Apr 22 19:00:05.931150 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:05.931129 2573 scope.go:117] "RemoveContainer" containerID="606bf09a465dde8c3835e20d0795b6f9d407bcfe4bc3f0358ad07a8b671606bf" Apr 22 19:00:05.931379 ip-10-0-133-84 kubenswrapper[2573]: E0422 19:00:05.931358 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"606bf09a465dde8c3835e20d0795b6f9d407bcfe4bc3f0358ad07a8b671606bf\": container with ID starting with 606bf09a465dde8c3835e20d0795b6f9d407bcfe4bc3f0358ad07a8b671606bf not found: ID does not exist" containerID="606bf09a465dde8c3835e20d0795b6f9d407bcfe4bc3f0358ad07a8b671606bf" Apr 22 19:00:05.931418 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:05.931385 2573 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"606bf09a465dde8c3835e20d0795b6f9d407bcfe4bc3f0358ad07a8b671606bf"} err="failed to get container status \"606bf09a465dde8c3835e20d0795b6f9d407bcfe4bc3f0358ad07a8b671606bf\": rpc error: code = NotFound desc = could not find container \"606bf09a465dde8c3835e20d0795b6f9d407bcfe4bc3f0358ad07a8b671606bf\": container with ID starting with 606bf09a465dde8c3835e20d0795b6f9d407bcfe4bc3f0358ad07a8b671606bf not found: ID does not exist" Apr 22 19:00:05.931418 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:05.931399 2573 scope.go:117] "RemoveContainer" containerID="cf00b46de7ba7c1d5d7189b6688c93f5eecedef8757569f2642a51d6d86aba46" Apr 22 19:00:05.931640 ip-10-0-133-84 kubenswrapper[2573]: E0422 19:00:05.931622 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf00b46de7ba7c1d5d7189b6688c93f5eecedef8757569f2642a51d6d86aba46\": container with ID starting with cf00b46de7ba7c1d5d7189b6688c93f5eecedef8757569f2642a51d6d86aba46 not found: ID does not exist" containerID="cf00b46de7ba7c1d5d7189b6688c93f5eecedef8757569f2642a51d6d86aba46" Apr 22 19:00:05.931689 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:05.931646 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf00b46de7ba7c1d5d7189b6688c93f5eecedef8757569f2642a51d6d86aba46"} err="failed to get container status \"cf00b46de7ba7c1d5d7189b6688c93f5eecedef8757569f2642a51d6d86aba46\": rpc error: code = NotFound desc = could not find container \"cf00b46de7ba7c1d5d7189b6688c93f5eecedef8757569f2642a51d6d86aba46\": container with ID starting with cf00b46de7ba7c1d5d7189b6688c93f5eecedef8757569f2642a51d6d86aba46 not found: ID does not exist" Apr 22 19:00:06.837226 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:06.837193 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52"] Apr 22 19:00:06.837582 
ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:06.837564 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee855db9-55e2-469e-96e4-2b438bfc4923" containerName="tokenizer" Apr 22 19:00:06.837664 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:06.837585 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee855db9-55e2-469e-96e4-2b438bfc4923" containerName="tokenizer" Apr 22 19:00:06.837664 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:06.837613 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee855db9-55e2-469e-96e4-2b438bfc4923" containerName="main" Apr 22 19:00:06.837664 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:06.837622 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee855db9-55e2-469e-96e4-2b438bfc4923" containerName="main" Apr 22 19:00:06.837664 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:06.837647 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee855db9-55e2-469e-96e4-2b438bfc4923" containerName="storage-initializer" Apr 22 19:00:06.837664 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:06.837657 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee855db9-55e2-469e-96e4-2b438bfc4923" containerName="storage-initializer" Apr 22 19:00:06.837919 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:06.837758 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="ee855db9-55e2-469e-96e4-2b438bfc4923" containerName="main" Apr 22 19:00:06.837919 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:06.837771 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="ee855db9-55e2-469e-96e4-2b438bfc4923" containerName="tokenizer" Apr 22 19:00:06.842661 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:06.842638 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52" Apr 22 19:00:06.845566 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:06.845538 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 19:00:06.845677 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:06.845586 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4mhhb\"" Apr 22 19:00:06.845788 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:06.845774 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 19:00:06.846642 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:06.846627 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 22 19:00:06.850762 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:06.850739 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52"] Apr 22 19:00:07.030524 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.030488 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9-home\") pod \"precise-prefix-cache-test-kserve-5f88bfbb4-p6d52\" (UID: \"884c537f-8d1a-4573-9b69-1eb9e1f8c3e9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52" Apr 22 19:00:07.030524 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.030539 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn7v2\" (UniqueName: \"kubernetes.io/projected/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9-kube-api-access-bn7v2\") pod 
\"precise-prefix-cache-test-kserve-5f88bfbb4-p6d52\" (UID: \"884c537f-8d1a-4573-9b69-1eb9e1f8c3e9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52" Apr 22 19:00:07.030957 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.030592 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9-dshm\") pod \"precise-prefix-cache-test-kserve-5f88bfbb4-p6d52\" (UID: \"884c537f-8d1a-4573-9b69-1eb9e1f8c3e9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52" Apr 22 19:00:07.030957 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.030671 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9-model-cache\") pod \"precise-prefix-cache-test-kserve-5f88bfbb4-p6d52\" (UID: \"884c537f-8d1a-4573-9b69-1eb9e1f8c3e9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52" Apr 22 19:00:07.030957 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.030693 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9-tls-certs\") pod \"precise-prefix-cache-test-kserve-5f88bfbb4-p6d52\" (UID: \"884c537f-8d1a-4573-9b69-1eb9e1f8c3e9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52" Apr 22 19:00:07.030957 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.030723 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-5f88bfbb4-p6d52\" (UID: \"884c537f-8d1a-4573-9b69-1eb9e1f8c3e9\") " 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52" Apr 22 19:00:07.132060 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.132018 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9-model-cache\") pod \"precise-prefix-cache-test-kserve-5f88bfbb4-p6d52\" (UID: \"884c537f-8d1a-4573-9b69-1eb9e1f8c3e9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52" Apr 22 19:00:07.132060 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.132068 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9-tls-certs\") pod \"precise-prefix-cache-test-kserve-5f88bfbb4-p6d52\" (UID: \"884c537f-8d1a-4573-9b69-1eb9e1f8c3e9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52" Apr 22 19:00:07.132347 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.132105 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-5f88bfbb4-p6d52\" (UID: \"884c537f-8d1a-4573-9b69-1eb9e1f8c3e9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52" Apr 22 19:00:07.132347 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.132152 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9-home\") pod \"precise-prefix-cache-test-kserve-5f88bfbb4-p6d52\" (UID: \"884c537f-8d1a-4573-9b69-1eb9e1f8c3e9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52" Apr 22 19:00:07.132347 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.132240 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bn7v2\" (UniqueName: \"kubernetes.io/projected/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9-kube-api-access-bn7v2\") pod \"precise-prefix-cache-test-kserve-5f88bfbb4-p6d52\" (UID: \"884c537f-8d1a-4573-9b69-1eb9e1f8c3e9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52" Apr 22 19:00:07.132347 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.132266 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9-dshm\") pod \"precise-prefix-cache-test-kserve-5f88bfbb4-p6d52\" (UID: \"884c537f-8d1a-4573-9b69-1eb9e1f8c3e9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52" Apr 22 19:00:07.132529 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.132469 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9-model-cache\") pod \"precise-prefix-cache-test-kserve-5f88bfbb4-p6d52\" (UID: \"884c537f-8d1a-4573-9b69-1eb9e1f8c3e9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52" Apr 22 19:00:07.132529 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.132490 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-5f88bfbb4-p6d52\" (UID: \"884c537f-8d1a-4573-9b69-1eb9e1f8c3e9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52" Apr 22 19:00:07.132970 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.132880 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9-home\") pod 
\"precise-prefix-cache-test-kserve-5f88bfbb4-p6d52\" (UID: \"884c537f-8d1a-4573-9b69-1eb9e1f8c3e9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52" Apr 22 19:00:07.134845 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.134814 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9-dshm\") pod \"precise-prefix-cache-test-kserve-5f88bfbb4-p6d52\" (UID: \"884c537f-8d1a-4573-9b69-1eb9e1f8c3e9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52" Apr 22 19:00:07.134951 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.134822 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9-tls-certs\") pod \"precise-prefix-cache-test-kserve-5f88bfbb4-p6d52\" (UID: \"884c537f-8d1a-4573-9b69-1eb9e1f8c3e9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52" Apr 22 19:00:07.139988 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.139963 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d"] Apr 22 19:00:07.144036 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.144015 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" Apr 22 19:00:07.146478 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.146458 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-epp-sa-dockercfg-fvhrx\"" Apr 22 19:00:07.147740 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.147717 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn7v2\" (UniqueName: \"kubernetes.io/projected/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9-kube-api-access-bn7v2\") pod \"precise-prefix-cache-test-kserve-5f88bfbb4-p6d52\" (UID: \"884c537f-8d1a-4573-9b69-1eb9e1f8c3e9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52" Apr 22 19:00:07.153656 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.153614 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d"] Apr 22 19:00:07.153756 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.153733 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52" Apr 22 19:00:07.282561 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.282450 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52"] Apr 22 19:00:07.285600 ip-10-0-133-84 kubenswrapper[2573]: W0422 19:00:07.285572 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod884c537f_8d1a_4573_9b69_1eb9e1f8c3e9.slice/crio-8d517fe862f0b1e54993ee772532b1e8ef411d880ef8480bfadeb4480b8451c6 WatchSource:0}: Error finding container 8d517fe862f0b1e54993ee772532b1e8ef411d880ef8480bfadeb4480b8451c6: Status 404 returned error can't find the container with id 8d517fe862f0b1e54993ee772532b1e8ef411d880ef8480bfadeb4480b8451c6 Apr 22 19:00:07.334881 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.334853 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/cd0a0692-d9e4-4848-a7df-452c680c6062-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d\" (UID: \"cd0a0692-d9e4-4848-a7df-452c680c6062\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" Apr 22 19:00:07.334995 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.334898 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdmv7\" (UniqueName: \"kubernetes.io/projected/cd0a0692-d9e4-4848-a7df-452c680c6062-kube-api-access-zdmv7\") pod \"precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d\" (UID: \"cd0a0692-d9e4-4848-a7df-452c680c6062\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" Apr 22 19:00:07.334995 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.334961 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cd0a0692-d9e4-4848-a7df-452c680c6062-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d\" (UID: \"cd0a0692-d9e4-4848-a7df-452c680c6062\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" Apr 22 19:00:07.335066 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.334999 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/cd0a0692-d9e4-4848-a7df-452c680c6062-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d\" (UID: \"cd0a0692-d9e4-4848-a7df-452c680c6062\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" Apr 22 19:00:07.335066 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.335018 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/cd0a0692-d9e4-4848-a7df-452c680c6062-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d\" (UID: \"cd0a0692-d9e4-4848-a7df-452c680c6062\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" Apr 22 19:00:07.335066 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.335048 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cd0a0692-d9e4-4848-a7df-452c680c6062-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d\" (UID: \"cd0a0692-d9e4-4848-a7df-452c680c6062\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" Apr 22 19:00:07.364155 ip-10-0-133-84 kubenswrapper[2573]: 
I0422 19:00:07.364122 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee855db9-55e2-469e-96e4-2b438bfc4923" path="/var/lib/kubelet/pods/ee855db9-55e2-469e-96e4-2b438bfc4923/volumes" Apr 22 19:00:07.435543 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.435458 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/cd0a0692-d9e4-4848-a7df-452c680c6062-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d\" (UID: \"cd0a0692-d9e4-4848-a7df-452c680c6062\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" Apr 22 19:00:07.435543 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.435503 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zdmv7\" (UniqueName: \"kubernetes.io/projected/cd0a0692-d9e4-4848-a7df-452c680c6062-kube-api-access-zdmv7\") pod \"precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d\" (UID: \"cd0a0692-d9e4-4848-a7df-452c680c6062\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" Apr 22 19:00:07.435543 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.435537 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cd0a0692-d9e4-4848-a7df-452c680c6062-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d\" (UID: \"cd0a0692-d9e4-4848-a7df-452c680c6062\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" Apr 22 19:00:07.435791 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.435562 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/cd0a0692-d9e4-4848-a7df-452c680c6062-tokenizer-uds\") pod 
\"precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d\" (UID: \"cd0a0692-d9e4-4848-a7df-452c680c6062\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" Apr 22 19:00:07.435791 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.435579 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/cd0a0692-d9e4-4848-a7df-452c680c6062-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d\" (UID: \"cd0a0692-d9e4-4848-a7df-452c680c6062\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" Apr 22 19:00:07.435791 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.435599 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cd0a0692-d9e4-4848-a7df-452c680c6062-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d\" (UID: \"cd0a0692-d9e4-4848-a7df-452c680c6062\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" Apr 22 19:00:07.436210 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.435924 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/cd0a0692-d9e4-4848-a7df-452c680c6062-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d\" (UID: \"cd0a0692-d9e4-4848-a7df-452c680c6062\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" Apr 22 19:00:07.436210 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.435936 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cd0a0692-d9e4-4848-a7df-452c680c6062-kserve-provision-location\") pod 
\"precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d\" (UID: \"cd0a0692-d9e4-4848-a7df-452c680c6062\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" Apr 22 19:00:07.436210 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.436000 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/cd0a0692-d9e4-4848-a7df-452c680c6062-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d\" (UID: \"cd0a0692-d9e4-4848-a7df-452c680c6062\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" Apr 22 19:00:07.436210 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.436143 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/cd0a0692-d9e4-4848-a7df-452c680c6062-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d\" (UID: \"cd0a0692-d9e4-4848-a7df-452c680c6062\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" Apr 22 19:00:07.438590 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.438567 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cd0a0692-d9e4-4848-a7df-452c680c6062-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d\" (UID: \"cd0a0692-d9e4-4848-a7df-452c680c6062\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" Apr 22 19:00:07.446338 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.446308 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdmv7\" (UniqueName: \"kubernetes.io/projected/cd0a0692-d9e4-4848-a7df-452c680c6062-kube-api-access-zdmv7\") pod \"precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d\" (UID: 
\"cd0a0692-d9e4-4848-a7df-452c680c6062\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" Apr 22 19:00:07.479228 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.479187 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" Apr 22 19:00:07.615746 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.615716 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d"] Apr 22 19:00:07.621089 ip-10-0-133-84 kubenswrapper[2573]: W0422 19:00:07.621046 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd0a0692_d9e4_4848_a7df_452c680c6062.slice/crio-fbfd6967f05006eca6b1ccc69561d8ac8ed786910ab268012f800d5eb712905c WatchSource:0}: Error finding container fbfd6967f05006eca6b1ccc69561d8ac8ed786910ab268012f800d5eb712905c: Status 404 returned error can't find the container with id fbfd6967f05006eca6b1ccc69561d8ac8ed786910ab268012f800d5eb712905c Apr 22 19:00:07.916419 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.916378 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" event={"ID":"cd0a0692-d9e4-4848-a7df-452c680c6062","Type":"ContainerStarted","Data":"2246c4d94ff3ac383d7ec0c506f3fc3098e56bc3763526568026d43b74726df7"} Apr 22 19:00:07.916601 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.916435 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" event={"ID":"cd0a0692-d9e4-4848-a7df-452c680c6062","Type":"ContainerStarted","Data":"fbfd6967f05006eca6b1ccc69561d8ac8ed786910ab268012f800d5eb712905c"} Apr 22 19:00:07.917876 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.917849 
2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52" event={"ID":"884c537f-8d1a-4573-9b69-1eb9e1f8c3e9","Type":"ContainerStarted","Data":"df4766343c3ce30a1a9e8addfa3bb58bdda1c9277f67f0ae254f3d9687801089"} Apr 22 19:00:07.918026 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:07.917892 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52" event={"ID":"884c537f-8d1a-4573-9b69-1eb9e1f8c3e9","Type":"ContainerStarted","Data":"8d517fe862f0b1e54993ee772532b1e8ef411d880ef8480bfadeb4480b8451c6"} Apr 22 19:00:08.923069 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:08.923028 2573 generic.go:358] "Generic (PLEG): container finished" podID="cd0a0692-d9e4-4848-a7df-452c680c6062" containerID="2246c4d94ff3ac383d7ec0c506f3fc3098e56bc3763526568026d43b74726df7" exitCode=0 Apr 22 19:00:08.923564 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:08.923107 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" event={"ID":"cd0a0692-d9e4-4848-a7df-452c680c6062","Type":"ContainerDied","Data":"2246c4d94ff3ac383d7ec0c506f3fc3098e56bc3763526568026d43b74726df7"} Apr 22 19:00:09.930642 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:09.930598 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" event={"ID":"cd0a0692-d9e4-4848-a7df-452c680c6062","Type":"ContainerStarted","Data":"bf84990e724b77fcad81316120bd6f339dff0bd935266d1197a10a74a8cd43cf"} Apr 22 19:00:09.930642 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:09.930646 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" 
event={"ID":"cd0a0692-d9e4-4848-a7df-452c680c6062","Type":"ContainerStarted","Data":"f8af1d968ea643b5cff010a8349cc1bf5e777830b9a0724e1ca809ea97f8b563"} Apr 22 19:00:09.931200 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:09.930690 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" Apr 22 19:00:09.951415 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:09.951354 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" podStartSLOduration=2.951332315 podStartE2EDuration="2.951332315s" podCreationTimestamp="2026-04-22 19:00:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:00:09.948469887 +0000 UTC m=+805.188585994" watchObservedRunningTime="2026-04-22 19:00:09.951332315 +0000 UTC m=+805.191448419" Apr 22 19:00:11.939646 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:11.939610 2573 generic.go:358] "Generic (PLEG): container finished" podID="884c537f-8d1a-4573-9b69-1eb9e1f8c3e9" containerID="df4766343c3ce30a1a9e8addfa3bb58bdda1c9277f67f0ae254f3d9687801089" exitCode=0 Apr 22 19:00:11.940129 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:11.939690 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52" event={"ID":"884c537f-8d1a-4573-9b69-1eb9e1f8c3e9","Type":"ContainerDied","Data":"df4766343c3ce30a1a9e8addfa3bb58bdda1c9277f67f0ae254f3d9687801089"} Apr 22 19:00:17.480149 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:17.480114 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" Apr 22 19:00:17.480599 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:17.480282 2573 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" Apr 22 19:00:17.481486 ip-10-0-133-84 kubenswrapper[2573]: W0422 19:00:17.481467 2573 logging.go:55] [core] [Channel #41 SubChannel #42]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.28:9003", ServerName: "10.133.0.28:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.28:9003: connect: connection refused" Apr 22 19:00:17.482724 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:17.482708 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" Apr 22 19:00:17.963704 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:17.963676 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" Apr 22 19:00:18.480678 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:18.480637 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" podUID="cd0a0692-d9e4-4848-a7df-452c680c6062" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.28:9003\" within 1s: context deadline exceeded" Apr 22 19:00:23.207655 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:23.207622 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv"] Apr 22 19:00:23.243884 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:23.243853 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv"] Apr 22 19:00:23.244029 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:23.243992 2573 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv" Apr 22 19:00:23.246560 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:23.246534 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-55f7ae4a-epp-sa-dockercfg-c7rcb\"" Apr 22 19:00:23.246686 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:23.246585 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 22 19:00:23.255837 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:23.255810 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fcf931ff-735a-4729-96fa-fbc57d4428c0-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv\" (UID: \"fcf931ff-735a-4729-96fa-fbc57d4428c0\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv" Apr 22 19:00:23.255948 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:23.255848 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fcf931ff-735a-4729-96fa-fbc57d4428c0-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv\" (UID: \"fcf931ff-735a-4729-96fa-fbc57d4428c0\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv" Apr 22 19:00:23.255948 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:23.255875 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfnmg\" (UniqueName: \"kubernetes.io/projected/fcf931ff-735a-4729-96fa-fbc57d4428c0-kube-api-access-vfnmg\") pod 
\"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv\" (UID: \"fcf931ff-735a-4729-96fa-fbc57d4428c0\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv" Apr 22 19:00:23.255948 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:23.255932 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fcf931ff-735a-4729-96fa-fbc57d4428c0-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv\" (UID: \"fcf931ff-735a-4729-96fa-fbc57d4428c0\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv" Apr 22 19:00:23.256123 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:23.255992 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fcf931ff-735a-4729-96fa-fbc57d4428c0-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv\" (UID: \"fcf931ff-735a-4729-96fa-fbc57d4428c0\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv" Apr 22 19:00:23.256123 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:23.256023 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fcf931ff-735a-4729-96fa-fbc57d4428c0-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv\" (UID: \"fcf931ff-735a-4729-96fa-fbc57d4428c0\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv" Apr 22 19:00:23.357098 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:23.357057 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/fcf931ff-735a-4729-96fa-fbc57d4428c0-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv\" (UID: \"fcf931ff-735a-4729-96fa-fbc57d4428c0\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv" Apr 22 19:00:23.357406 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:23.357384 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vfnmg\" (UniqueName: \"kubernetes.io/projected/fcf931ff-735a-4729-96fa-fbc57d4428c0-kube-api-access-vfnmg\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv\" (UID: \"fcf931ff-735a-4729-96fa-fbc57d4428c0\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv" Apr 22 19:00:23.357801 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:23.357775 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fcf931ff-735a-4729-96fa-fbc57d4428c0-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv\" (UID: \"fcf931ff-735a-4729-96fa-fbc57d4428c0\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv" Apr 22 19:00:23.357964 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:23.357945 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fcf931ff-735a-4729-96fa-fbc57d4428c0-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv\" (UID: \"fcf931ff-735a-4729-96fa-fbc57d4428c0\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv" Apr 22 19:00:23.358138 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:23.358122 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/fcf931ff-735a-4729-96fa-fbc57d4428c0-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv\" (UID: \"fcf931ff-735a-4729-96fa-fbc57d4428c0\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv" Apr 22 19:00:23.358348 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:23.357591 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fcf931ff-735a-4729-96fa-fbc57d4428c0-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv\" (UID: \"fcf931ff-735a-4729-96fa-fbc57d4428c0\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv" Apr 22 19:00:23.358436 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:23.358195 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fcf931ff-735a-4729-96fa-fbc57d4428c0-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv\" (UID: \"fcf931ff-735a-4729-96fa-fbc57d4428c0\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv" Apr 22 19:00:23.358436 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:23.358336 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fcf931ff-735a-4729-96fa-fbc57d4428c0-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv\" (UID: \"fcf931ff-735a-4729-96fa-fbc57d4428c0\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv" Apr 22 19:00:23.358554 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:23.358497 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/fcf931ff-735a-4729-96fa-fbc57d4428c0-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv\" (UID: \"fcf931ff-735a-4729-96fa-fbc57d4428c0\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv" Apr 22 19:00:23.358627 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:23.358598 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fcf931ff-735a-4729-96fa-fbc57d4428c0-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv\" (UID: \"fcf931ff-735a-4729-96fa-fbc57d4428c0\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv" Apr 22 19:00:23.362196 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:23.362055 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fcf931ff-735a-4729-96fa-fbc57d4428c0-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv\" (UID: \"fcf931ff-735a-4729-96fa-fbc57d4428c0\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv" Apr 22 19:00:23.366036 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:23.366011 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfnmg\" (UniqueName: \"kubernetes.io/projected/fcf931ff-735a-4729-96fa-fbc57d4428c0-kube-api-access-vfnmg\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv\" (UID: \"fcf931ff-735a-4729-96fa-fbc57d4428c0\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv" Apr 22 19:00:23.554664 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:23.554582 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv" Apr 22 19:00:23.711964 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:23.711852 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv"] Apr 22 19:00:23.714483 ip-10-0-133-84 kubenswrapper[2573]: W0422 19:00:23.714454 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcf931ff_735a_4729_96fa_fbc57d4428c0.slice/crio-b1df102f5b7638fa69b30f77253bccdcbcb5a2b15978635530353f71c3d727cf WatchSource:0}: Error finding container b1df102f5b7638fa69b30f77253bccdcbcb5a2b15978635530353f71c3d727cf: Status 404 returned error can't find the container with id b1df102f5b7638fa69b30f77253bccdcbcb5a2b15978635530353f71c3d727cf Apr 22 19:00:23.987238 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:23.987141 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv" event={"ID":"fcf931ff-735a-4729-96fa-fbc57d4428c0","Type":"ContainerStarted","Data":"ae9b9b74466cd2dcc48d1698944da7e6aff24f73a76f882b354246ffd1168b54"} Apr 22 19:00:23.987238 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:23.987207 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv" event={"ID":"fcf931ff-735a-4729-96fa-fbc57d4428c0","Type":"ContainerStarted","Data":"b1df102f5b7638fa69b30f77253bccdcbcb5a2b15978635530353f71c3d727cf"} Apr 22 19:00:24.993563 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:24.993484 2573 generic.go:358] "Generic (PLEG): container finished" podID="fcf931ff-735a-4729-96fa-fbc57d4428c0" containerID="ae9b9b74466cd2dcc48d1698944da7e6aff24f73a76f882b354246ffd1168b54" exitCode=0 Apr 22 19:00:24.994107 ip-10-0-133-84 kubenswrapper[2573]: I0422 
19:00:24.993585 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv" event={"ID":"fcf931ff-735a-4729-96fa-fbc57d4428c0","Type":"ContainerDied","Data":"ae9b9b74466cd2dcc48d1698944da7e6aff24f73a76f882b354246ffd1168b54"} Apr 22 19:00:24.995413 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:24.995380 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52" event={"ID":"884c537f-8d1a-4573-9b69-1eb9e1f8c3e9","Type":"ContainerStarted","Data":"6b7d870fd9f6c45f691bde950a8a256e6a1ef701f576ea989bd9b6257f65e42b"} Apr 22 19:00:25.028762 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:25.028713 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52" podStartSLOduration=6.306601178 podStartE2EDuration="19.028698653s" podCreationTimestamp="2026-04-22 19:00:06 +0000 UTC" firstStartedPulling="2026-04-22 19:00:11.9408953 +0000 UTC m=+807.181011380" lastFinishedPulling="2026-04-22 19:00:24.662992761 +0000 UTC m=+819.903108855" observedRunningTime="2026-04-22 19:00:25.027475931 +0000 UTC m=+820.267592033" watchObservedRunningTime="2026-04-22 19:00:25.028698653 +0000 UTC m=+820.268814756" Apr 22 19:00:26.001105 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:26.001060 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv" event={"ID":"fcf931ff-735a-4729-96fa-fbc57d4428c0","Type":"ContainerStarted","Data":"a60ae62b4728b65f87ef543ff7ccbd95001c5b736c8660d3883b24a235be6552"} Apr 22 19:00:26.001105 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:26.001109 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv" 
event={"ID":"fcf931ff-735a-4729-96fa-fbc57d4428c0","Type":"ContainerStarted","Data":"34723cd9c51b0f46515161f2dd1b16441b86d3b8a4ce4e472e679cd54e57d196"} Apr 22 19:00:26.001582 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:26.001213 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv" Apr 22 19:00:26.022757 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:26.022690 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv" podStartSLOduration=3.02267023 podStartE2EDuration="3.02267023s" podCreationTimestamp="2026-04-22 19:00:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:00:26.01975309 +0000 UTC m=+821.259869204" watchObservedRunningTime="2026-04-22 19:00:26.02267023 +0000 UTC m=+821.262786333" Apr 22 19:00:27.154397 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:27.154356 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52" Apr 22 19:00:27.154397 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:27.154406 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52" Apr 22 19:00:27.167214 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:27.167159 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52" Apr 22 19:00:27.480405 ip-10-0-133-84 kubenswrapper[2573]: W0422 19:00:27.480308 2573 logging.go:55] [core] [Channel #43 SubChannel #44]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.28:9003", ServerName: "10.133.0.28:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.28:9003: connect: connection refused" Apr 22 19:00:28.019749 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:28.019717 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52" Apr 22 19:00:28.480548 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:28.480500 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" podUID="cd0a0692-d9e4-4848-a7df-452c680c6062" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.28:9003\" within 1s: context deadline exceeded" Apr 22 19:00:33.555412 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:33.555374 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv" Apr 22 19:00:33.555412 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:33.555413 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv" Apr 22 19:00:33.557959 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:33.557934 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv" Apr 22 19:00:34.031118 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:34.031088 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv" Apr 22 19:00:39.971597 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:39.971570 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" Apr 22 
19:00:41.112350 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:41.112304 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52"] Apr 22 19:00:41.112781 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:41.112714 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52" podUID="884c537f-8d1a-4573-9b69-1eb9e1f8c3e9" containerName="main" containerID="cri-o://6b7d870fd9f6c45f691bde950a8a256e6a1ef701f576ea989bd9b6257f65e42b" gracePeriod=30 Apr 22 19:00:41.120828 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:41.120805 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d"] Apr 22 19:00:41.121230 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:41.121137 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" podUID="cd0a0692-d9e4-4848-a7df-452c680c6062" containerName="main" containerID="cri-o://f8af1d968ea643b5cff010a8349cc1bf5e777830b9a0724e1ca809ea97f8b563" gracePeriod=30 Apr 22 19:00:41.121332 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:41.121250 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" podUID="cd0a0692-d9e4-4848-a7df-452c680c6062" containerName="tokenizer" containerID="cri-o://bf84990e724b77fcad81316120bd6f339dff0bd935266d1197a10a74a8cd43cf" gracePeriod=30 Apr 22 19:00:41.391378 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:41.391355 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52" Apr 22 19:00:41.521925 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:41.521884 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9-model-cache\") pod \"884c537f-8d1a-4573-9b69-1eb9e1f8c3e9\" (UID: \"884c537f-8d1a-4573-9b69-1eb9e1f8c3e9\") " Apr 22 19:00:41.522142 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:41.521958 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn7v2\" (UniqueName: \"kubernetes.io/projected/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9-kube-api-access-bn7v2\") pod \"884c537f-8d1a-4573-9b69-1eb9e1f8c3e9\" (UID: \"884c537f-8d1a-4573-9b69-1eb9e1f8c3e9\") " Apr 22 19:00:41.522142 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:41.521996 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9-home\") pod \"884c537f-8d1a-4573-9b69-1eb9e1f8c3e9\" (UID: \"884c537f-8d1a-4573-9b69-1eb9e1f8c3e9\") " Apr 22 19:00:41.522142 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:41.522060 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9-tls-certs\") pod \"884c537f-8d1a-4573-9b69-1eb9e1f8c3e9\" (UID: \"884c537f-8d1a-4573-9b69-1eb9e1f8c3e9\") " Apr 22 19:00:41.522142 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:41.522094 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9-dshm\") pod \"884c537f-8d1a-4573-9b69-1eb9e1f8c3e9\" (UID: \"884c537f-8d1a-4573-9b69-1eb9e1f8c3e9\") " Apr 22 19:00:41.522142 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:41.522125 2573 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9-kserve-provision-location\") pod \"884c537f-8d1a-4573-9b69-1eb9e1f8c3e9\" (UID: \"884c537f-8d1a-4573-9b69-1eb9e1f8c3e9\") " Apr 22 19:00:41.522142 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:41.522127 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9-model-cache" (OuterVolumeSpecName: "model-cache") pod "884c537f-8d1a-4573-9b69-1eb9e1f8c3e9" (UID: "884c537f-8d1a-4573-9b69-1eb9e1f8c3e9"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:00:41.522487 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:41.522312 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9-home" (OuterVolumeSpecName: "home") pod "884c537f-8d1a-4573-9b69-1eb9e1f8c3e9" (UID: "884c537f-8d1a-4573-9b69-1eb9e1f8c3e9"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:00:41.522487 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:41.522408 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9-model-cache\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:00:41.522487 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:41.522424 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9-home\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:00:41.524411 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:41.524375 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9-dshm" (OuterVolumeSpecName: "dshm") pod "884c537f-8d1a-4573-9b69-1eb9e1f8c3e9" (UID: "884c537f-8d1a-4573-9b69-1eb9e1f8c3e9"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:00:41.524530 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:41.524436 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9-kube-api-access-bn7v2" (OuterVolumeSpecName: "kube-api-access-bn7v2") pod "884c537f-8d1a-4573-9b69-1eb9e1f8c3e9" (UID: "884c537f-8d1a-4573-9b69-1eb9e1f8c3e9"). InnerVolumeSpecName "kube-api-access-bn7v2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:00:41.524579 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:41.524528 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "884c537f-8d1a-4573-9b69-1eb9e1f8c3e9" (UID: "884c537f-8d1a-4573-9b69-1eb9e1f8c3e9"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:00:41.583525 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:41.583473 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "884c537f-8d1a-4573-9b69-1eb9e1f8c3e9" (UID: "884c537f-8d1a-4573-9b69-1eb9e1f8c3e9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:00:41.623274 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:41.623241 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bn7v2\" (UniqueName: \"kubernetes.io/projected/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9-kube-api-access-bn7v2\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:00:41.623456 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:41.623277 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9-tls-certs\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:00:41.623456 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:41.623293 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9-dshm\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:00:41.623456 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:41.623305 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9-kserve-provision-location\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:00:42.059693 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:42.059601 2573 generic.go:358] "Generic (PLEG): container finished" podID="884c537f-8d1a-4573-9b69-1eb9e1f8c3e9" 
containerID="6b7d870fd9f6c45f691bde950a8a256e6a1ef701f576ea989bd9b6257f65e42b" exitCode=0 Apr 22 19:00:42.059693 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:42.059685 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52" Apr 22 19:00:42.059896 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:42.059690 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52" event={"ID":"884c537f-8d1a-4573-9b69-1eb9e1f8c3e9","Type":"ContainerDied","Data":"6b7d870fd9f6c45f691bde950a8a256e6a1ef701f576ea989bd9b6257f65e42b"} Apr 22 19:00:42.059896 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:42.059739 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52" event={"ID":"884c537f-8d1a-4573-9b69-1eb9e1f8c3e9","Type":"ContainerDied","Data":"8d517fe862f0b1e54993ee772532b1e8ef411d880ef8480bfadeb4480b8451c6"} Apr 22 19:00:42.059896 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:42.059759 2573 scope.go:117] "RemoveContainer" containerID="6b7d870fd9f6c45f691bde950a8a256e6a1ef701f576ea989bd9b6257f65e42b" Apr 22 19:00:42.063013 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:42.062991 2573 generic.go:358] "Generic (PLEG): container finished" podID="cd0a0692-d9e4-4848-a7df-452c680c6062" containerID="f8af1d968ea643b5cff010a8349cc1bf5e777830b9a0724e1ca809ea97f8b563" exitCode=0 Apr 22 19:00:42.063134 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:42.063027 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" event={"ID":"cd0a0692-d9e4-4848-a7df-452c680c6062","Type":"ContainerDied","Data":"f8af1d968ea643b5cff010a8349cc1bf5e777830b9a0724e1ca809ea97f8b563"} Apr 22 19:00:42.069357 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:42.069339 2573 scope.go:117] 
"RemoveContainer" containerID="df4766343c3ce30a1a9e8addfa3bb58bdda1c9277f67f0ae254f3d9687801089" Apr 22 19:00:42.079852 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:42.079822 2573 scope.go:117] "RemoveContainer" containerID="6b7d870fd9f6c45f691bde950a8a256e6a1ef701f576ea989bd9b6257f65e42b" Apr 22 19:00:42.080296 ip-10-0-133-84 kubenswrapper[2573]: E0422 19:00:42.080124 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b7d870fd9f6c45f691bde950a8a256e6a1ef701f576ea989bd9b6257f65e42b\": container with ID starting with 6b7d870fd9f6c45f691bde950a8a256e6a1ef701f576ea989bd9b6257f65e42b not found: ID does not exist" containerID="6b7d870fd9f6c45f691bde950a8a256e6a1ef701f576ea989bd9b6257f65e42b" Apr 22 19:00:42.080296 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:42.080191 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b7d870fd9f6c45f691bde950a8a256e6a1ef701f576ea989bd9b6257f65e42b"} err="failed to get container status \"6b7d870fd9f6c45f691bde950a8a256e6a1ef701f576ea989bd9b6257f65e42b\": rpc error: code = NotFound desc = could not find container \"6b7d870fd9f6c45f691bde950a8a256e6a1ef701f576ea989bd9b6257f65e42b\": container with ID starting with 6b7d870fd9f6c45f691bde950a8a256e6a1ef701f576ea989bd9b6257f65e42b not found: ID does not exist" Apr 22 19:00:42.080296 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:42.080219 2573 scope.go:117] "RemoveContainer" containerID="df4766343c3ce30a1a9e8addfa3bb58bdda1c9277f67f0ae254f3d9687801089" Apr 22 19:00:42.080794 ip-10-0-133-84 kubenswrapper[2573]: E0422 19:00:42.080755 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df4766343c3ce30a1a9e8addfa3bb58bdda1c9277f67f0ae254f3d9687801089\": container with ID starting with df4766343c3ce30a1a9e8addfa3bb58bdda1c9277f67f0ae254f3d9687801089 not found: ID does not exist" 
containerID="df4766343c3ce30a1a9e8addfa3bb58bdda1c9277f67f0ae254f3d9687801089" Apr 22 19:00:42.080898 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:42.080804 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df4766343c3ce30a1a9e8addfa3bb58bdda1c9277f67f0ae254f3d9687801089"} err="failed to get container status \"df4766343c3ce30a1a9e8addfa3bb58bdda1c9277f67f0ae254f3d9687801089\": rpc error: code = NotFound desc = could not find container \"df4766343c3ce30a1a9e8addfa3bb58bdda1c9277f67f0ae254f3d9687801089\": container with ID starting with df4766343c3ce30a1a9e8addfa3bb58bdda1c9277f67f0ae254f3d9687801089 not found: ID does not exist" Apr 22 19:00:42.081663 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:42.081643 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52"] Apr 22 19:00:42.085351 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:42.085331 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-p6d52"] Apr 22 19:00:42.564868 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:42.564843 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" Apr 22 19:00:42.633037 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:42.633000 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdmv7\" (UniqueName: \"kubernetes.io/projected/cd0a0692-d9e4-4848-a7df-452c680c6062-kube-api-access-zdmv7\") pod \"cd0a0692-d9e4-4848-a7df-452c680c6062\" (UID: \"cd0a0692-d9e4-4848-a7df-452c680c6062\") " Apr 22 19:00:42.633203 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:42.633072 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/cd0a0692-d9e4-4848-a7df-452c680c6062-tokenizer-tmp\") pod \"cd0a0692-d9e4-4848-a7df-452c680c6062\" (UID: \"cd0a0692-d9e4-4848-a7df-452c680c6062\") " Apr 22 19:00:42.633203 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:42.633094 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/cd0a0692-d9e4-4848-a7df-452c680c6062-tokenizer-uds\") pod \"cd0a0692-d9e4-4848-a7df-452c680c6062\" (UID: \"cd0a0692-d9e4-4848-a7df-452c680c6062\") " Apr 22 19:00:42.633203 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:42.633115 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/cd0a0692-d9e4-4848-a7df-452c680c6062-tokenizer-cache\") pod \"cd0a0692-d9e4-4848-a7df-452c680c6062\" (UID: \"cd0a0692-d9e4-4848-a7df-452c680c6062\") " Apr 22 19:00:42.633203 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:42.633139 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cd0a0692-d9e4-4848-a7df-452c680c6062-tls-certs\") pod \"cd0a0692-d9e4-4848-a7df-452c680c6062\" (UID: \"cd0a0692-d9e4-4848-a7df-452c680c6062\") " Apr 22 19:00:42.633203 
ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:42.633184 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cd0a0692-d9e4-4848-a7df-452c680c6062-kserve-provision-location\") pod \"cd0a0692-d9e4-4848-a7df-452c680c6062\" (UID: \"cd0a0692-d9e4-4848-a7df-452c680c6062\") " Apr 22 19:00:42.633489 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:42.633386 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd0a0692-d9e4-4848-a7df-452c680c6062-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "cd0a0692-d9e4-4848-a7df-452c680c6062" (UID: "cd0a0692-d9e4-4848-a7df-452c680c6062"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:00:42.633547 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:42.633522 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd0a0692-d9e4-4848-a7df-452c680c6062-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "cd0a0692-d9e4-4848-a7df-452c680c6062" (UID: "cd0a0692-d9e4-4848-a7df-452c680c6062"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:00:42.633547 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:42.633530 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd0a0692-d9e4-4848-a7df-452c680c6062-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "cd0a0692-d9e4-4848-a7df-452c680c6062" (UID: "cd0a0692-d9e4-4848-a7df-452c680c6062"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:00:42.633884 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:42.633865 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd0a0692-d9e4-4848-a7df-452c680c6062-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cd0a0692-d9e4-4848-a7df-452c680c6062" (UID: "cd0a0692-d9e4-4848-a7df-452c680c6062"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:00:42.635366 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:42.635341 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd0a0692-d9e4-4848-a7df-452c680c6062-kube-api-access-zdmv7" (OuterVolumeSpecName: "kube-api-access-zdmv7") pod "cd0a0692-d9e4-4848-a7df-452c680c6062" (UID: "cd0a0692-d9e4-4848-a7df-452c680c6062"). InnerVolumeSpecName "kube-api-access-zdmv7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:00:42.635467 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:42.635449 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd0a0692-d9e4-4848-a7df-452c680c6062-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "cd0a0692-d9e4-4848-a7df-452c680c6062" (UID: "cd0a0692-d9e4-4848-a7df-452c680c6062"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:00:42.734766 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:42.734736 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zdmv7\" (UniqueName: \"kubernetes.io/projected/cd0a0692-d9e4-4848-a7df-452c680c6062-kube-api-access-zdmv7\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\""
Apr 22 19:00:42.734766 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:42.734760 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/cd0a0692-d9e4-4848-a7df-452c680c6062-tokenizer-tmp\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\""
Apr 22 19:00:42.734766 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:42.734770 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/cd0a0692-d9e4-4848-a7df-452c680c6062-tokenizer-uds\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\""
Apr 22 19:00:42.734961 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:42.734779 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/cd0a0692-d9e4-4848-a7df-452c680c6062-tokenizer-cache\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\""
Apr 22 19:00:42.734961 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:42.734788 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cd0a0692-d9e4-4848-a7df-452c680c6062-tls-certs\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\""
Apr 22 19:00:42.734961 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:42.734798 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cd0a0692-d9e4-4848-a7df-452c680c6062-kserve-provision-location\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\""
Apr 22 19:00:43.068932 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:43.068847 2573 generic.go:358] "Generic (PLEG): container finished" podID="cd0a0692-d9e4-4848-a7df-452c680c6062" containerID="bf84990e724b77fcad81316120bd6f339dff0bd935266d1197a10a74a8cd43cf" exitCode=0
Apr 22 19:00:43.068932 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:43.068892 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" event={"ID":"cd0a0692-d9e4-4848-a7df-452c680c6062","Type":"ContainerDied","Data":"bf84990e724b77fcad81316120bd6f339dff0bd935266d1197a10a74a8cd43cf"}
Apr 22 19:00:43.068932 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:43.068925 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d" event={"ID":"cd0a0692-d9e4-4848-a7df-452c680c6062","Type":"ContainerDied","Data":"fbfd6967f05006eca6b1ccc69561d8ac8ed786910ab268012f800d5eb712905c"}
Apr 22 19:00:43.069161 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:43.068944 2573 scope.go:117] "RemoveContainer" containerID="bf84990e724b77fcad81316120bd6f339dff0bd935266d1197a10a74a8cd43cf"
Apr 22 19:00:43.069161 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:43.068959 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d"
Apr 22 19:00:43.077918 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:43.077900 2573 scope.go:117] "RemoveContainer" containerID="f8af1d968ea643b5cff010a8349cc1bf5e777830b9a0724e1ca809ea97f8b563"
Apr 22 19:00:43.085011 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:43.084991 2573 scope.go:117] "RemoveContainer" containerID="2246c4d94ff3ac383d7ec0c506f3fc3098e56bc3763526568026d43b74726df7"
Apr 22 19:00:43.090475 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:43.090452 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d"]
Apr 22 19:00:43.094470 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:43.094435 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68bcf977tc69d"]
Apr 22 19:00:43.094925 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:43.094906 2573 scope.go:117] "RemoveContainer" containerID="bf84990e724b77fcad81316120bd6f339dff0bd935266d1197a10a74a8cd43cf"
Apr 22 19:00:43.095218 ip-10-0-133-84 kubenswrapper[2573]: E0422 19:00:43.095201 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf84990e724b77fcad81316120bd6f339dff0bd935266d1197a10a74a8cd43cf\": container with ID starting with bf84990e724b77fcad81316120bd6f339dff0bd935266d1197a10a74a8cd43cf not found: ID does not exist" containerID="bf84990e724b77fcad81316120bd6f339dff0bd935266d1197a10a74a8cd43cf"
Apr 22 19:00:43.095261 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:43.095226 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf84990e724b77fcad81316120bd6f339dff0bd935266d1197a10a74a8cd43cf"} err="failed to get container status \"bf84990e724b77fcad81316120bd6f339dff0bd935266d1197a10a74a8cd43cf\": rpc error: code = NotFound desc = could not find container \"bf84990e724b77fcad81316120bd6f339dff0bd935266d1197a10a74a8cd43cf\": container with ID starting with bf84990e724b77fcad81316120bd6f339dff0bd935266d1197a10a74a8cd43cf not found: ID does not exist"
Apr 22 19:00:43.095261 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:43.095243 2573 scope.go:117] "RemoveContainer" containerID="f8af1d968ea643b5cff010a8349cc1bf5e777830b9a0724e1ca809ea97f8b563"
Apr 22 19:00:43.095464 ip-10-0-133-84 kubenswrapper[2573]: E0422 19:00:43.095449 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8af1d968ea643b5cff010a8349cc1bf5e777830b9a0724e1ca809ea97f8b563\": container with ID starting with f8af1d968ea643b5cff010a8349cc1bf5e777830b9a0724e1ca809ea97f8b563 not found: ID does not exist" containerID="f8af1d968ea643b5cff010a8349cc1bf5e777830b9a0724e1ca809ea97f8b563"
Apr 22 19:00:43.095497 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:43.095468 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8af1d968ea643b5cff010a8349cc1bf5e777830b9a0724e1ca809ea97f8b563"} err="failed to get container status \"f8af1d968ea643b5cff010a8349cc1bf5e777830b9a0724e1ca809ea97f8b563\": rpc error: code = NotFound desc = could not find container \"f8af1d968ea643b5cff010a8349cc1bf5e777830b9a0724e1ca809ea97f8b563\": container with ID starting with f8af1d968ea643b5cff010a8349cc1bf5e777830b9a0724e1ca809ea97f8b563 not found: ID does not exist"
Apr 22 19:00:43.095497 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:43.095481 2573 scope.go:117] "RemoveContainer" containerID="2246c4d94ff3ac383d7ec0c506f3fc3098e56bc3763526568026d43b74726df7"
Apr 22 19:00:43.095697 ip-10-0-133-84 kubenswrapper[2573]: E0422 19:00:43.095677 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2246c4d94ff3ac383d7ec0c506f3fc3098e56bc3763526568026d43b74726df7\": container with ID starting with 2246c4d94ff3ac383d7ec0c506f3fc3098e56bc3763526568026d43b74726df7 not found: ID does not exist" containerID="2246c4d94ff3ac383d7ec0c506f3fc3098e56bc3763526568026d43b74726df7"
Apr 22 19:00:43.095737 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:43.095703 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2246c4d94ff3ac383d7ec0c506f3fc3098e56bc3763526568026d43b74726df7"} err="failed to get container status \"2246c4d94ff3ac383d7ec0c506f3fc3098e56bc3763526568026d43b74726df7\": rpc error: code = NotFound desc = could not find container \"2246c4d94ff3ac383d7ec0c506f3fc3098e56bc3763526568026d43b74726df7\": container with ID starting with 2246c4d94ff3ac383d7ec0c506f3fc3098e56bc3763526568026d43b74726df7 not found: ID does not exist"
Apr 22 19:00:43.364028 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:43.363956 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="884c537f-8d1a-4573-9b69-1eb9e1f8c3e9" path="/var/lib/kubelet/pods/884c537f-8d1a-4573-9b69-1eb9e1f8c3e9/volumes"
Apr 22 19:00:43.364389 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:43.364376 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd0a0692-d9e4-4848-a7df-452c680c6062" path="/var/lib/kubelet/pods/cd0a0692-d9e4-4848-a7df-452c680c6062/volumes"
Apr 22 19:00:55.035204 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:00:55.035156 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv"
Apr 22 19:01:45.332449 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:01:45.332418 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cr6wp_bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278/ovn-acl-logging/0.log"
Apr 22 19:01:45.333547 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:01:45.333527 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cr6wp_bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278/ovn-acl-logging/0.log"
Apr 22 19:03:32.741543 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:32.741512 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv"]
Apr 22 19:03:32.742126 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:32.741929 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv" podUID="fcf931ff-735a-4729-96fa-fbc57d4428c0" containerName="main" containerID="cri-o://34723cd9c51b0f46515161f2dd1b16441b86d3b8a4ce4e472e679cd54e57d196" gracePeriod=30
Apr 22 19:03:32.742228 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:32.742093 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv" podUID="fcf931ff-735a-4729-96fa-fbc57d4428c0" containerName="tokenizer" containerID="cri-o://a60ae62b4728b65f87ef543ff7ccbd95001c5b736c8660d3883b24a235be6552" gracePeriod=30
Apr 22 19:03:33.648026 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:33.647990 2573 generic.go:358] "Generic (PLEG): container finished" podID="fcf931ff-735a-4729-96fa-fbc57d4428c0" containerID="34723cd9c51b0f46515161f2dd1b16441b86d3b8a4ce4e472e679cd54e57d196" exitCode=0
Apr 22 19:03:33.648232 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:33.648063 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv" event={"ID":"fcf931ff-735a-4729-96fa-fbc57d4428c0","Type":"ContainerDied","Data":"34723cd9c51b0f46515161f2dd1b16441b86d3b8a4ce4e472e679cd54e57d196"}
Apr 22 19:03:33.983838 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:33.983818 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv"
Apr 22 19:03:34.015409 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:34.015378 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fcf931ff-735a-4729-96fa-fbc57d4428c0-tokenizer-uds\") pod \"fcf931ff-735a-4729-96fa-fbc57d4428c0\" (UID: \"fcf931ff-735a-4729-96fa-fbc57d4428c0\") "
Apr 22 19:03:34.015564 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:34.015451 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfnmg\" (UniqueName: \"kubernetes.io/projected/fcf931ff-735a-4729-96fa-fbc57d4428c0-kube-api-access-vfnmg\") pod \"fcf931ff-735a-4729-96fa-fbc57d4428c0\" (UID: \"fcf931ff-735a-4729-96fa-fbc57d4428c0\") "
Apr 22 19:03:34.015564 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:34.015479 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fcf931ff-735a-4729-96fa-fbc57d4428c0-tokenizer-cache\") pod \"fcf931ff-735a-4729-96fa-fbc57d4428c0\" (UID: \"fcf931ff-735a-4729-96fa-fbc57d4428c0\") "
Apr 22 19:03:34.015564 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:34.015510 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fcf931ff-735a-4729-96fa-fbc57d4428c0-kserve-provision-location\") pod \"fcf931ff-735a-4729-96fa-fbc57d4428c0\" (UID: \"fcf931ff-735a-4729-96fa-fbc57d4428c0\") "
Apr 22 19:03:34.015564 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:34.015544 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fcf931ff-735a-4729-96fa-fbc57d4428c0-tokenizer-tmp\") pod \"fcf931ff-735a-4729-96fa-fbc57d4428c0\" (UID: \"fcf931ff-735a-4729-96fa-fbc57d4428c0\") "
Apr 22 19:03:34.015741 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:34.015601 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fcf931ff-735a-4729-96fa-fbc57d4428c0-tls-certs\") pod \"fcf931ff-735a-4729-96fa-fbc57d4428c0\" (UID: \"fcf931ff-735a-4729-96fa-fbc57d4428c0\") "
Apr 22 19:03:34.015741 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:34.015666 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcf931ff-735a-4729-96fa-fbc57d4428c0-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "fcf931ff-735a-4729-96fa-fbc57d4428c0" (UID: "fcf931ff-735a-4729-96fa-fbc57d4428c0"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:03:34.015850 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:34.015769 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcf931ff-735a-4729-96fa-fbc57d4428c0-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "fcf931ff-735a-4729-96fa-fbc57d4428c0" (UID: "fcf931ff-735a-4729-96fa-fbc57d4428c0"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:03:34.015909 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:34.015881 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fcf931ff-735a-4729-96fa-fbc57d4428c0-tokenizer-cache\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\""
Apr 22 19:03:34.015909 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:34.015900 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fcf931ff-735a-4729-96fa-fbc57d4428c0-tokenizer-uds\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\""
Apr 22 19:03:34.016008 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:34.015974 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcf931ff-735a-4729-96fa-fbc57d4428c0-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "fcf931ff-735a-4729-96fa-fbc57d4428c0" (UID: "fcf931ff-735a-4729-96fa-fbc57d4428c0"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:03:34.016561 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:34.016535 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcf931ff-735a-4729-96fa-fbc57d4428c0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fcf931ff-735a-4729-96fa-fbc57d4428c0" (UID: "fcf931ff-735a-4729-96fa-fbc57d4428c0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:03:34.017794 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:34.017770 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcf931ff-735a-4729-96fa-fbc57d4428c0-kube-api-access-vfnmg" (OuterVolumeSpecName: "kube-api-access-vfnmg") pod "fcf931ff-735a-4729-96fa-fbc57d4428c0" (UID: "fcf931ff-735a-4729-96fa-fbc57d4428c0"). InnerVolumeSpecName "kube-api-access-vfnmg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:03:34.017866 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:34.017848 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcf931ff-735a-4729-96fa-fbc57d4428c0-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "fcf931ff-735a-4729-96fa-fbc57d4428c0" (UID: "fcf931ff-735a-4729-96fa-fbc57d4428c0"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:03:34.116470 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:34.116434 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fcf931ff-735a-4729-96fa-fbc57d4428c0-tls-certs\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\""
Apr 22 19:03:34.116470 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:34.116470 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vfnmg\" (UniqueName: \"kubernetes.io/projected/fcf931ff-735a-4729-96fa-fbc57d4428c0-kube-api-access-vfnmg\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\""
Apr 22 19:03:34.116669 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:34.116483 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fcf931ff-735a-4729-96fa-fbc57d4428c0-kserve-provision-location\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\""
Apr 22 19:03:34.116669 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:34.116492 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fcf931ff-735a-4729-96fa-fbc57d4428c0-tokenizer-tmp\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\""
Apr 22 19:03:34.652937 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:34.652900 2573 generic.go:358] "Generic (PLEG): container finished" podID="fcf931ff-735a-4729-96fa-fbc57d4428c0" containerID="a60ae62b4728b65f87ef543ff7ccbd95001c5b736c8660d3883b24a235be6552" exitCode=0
Apr 22 19:03:34.653124 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:34.652956 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv" event={"ID":"fcf931ff-735a-4729-96fa-fbc57d4428c0","Type":"ContainerDied","Data":"a60ae62b4728b65f87ef543ff7ccbd95001c5b736c8660d3883b24a235be6552"}
Apr 22 19:03:34.653124 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:34.652982 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv" event={"ID":"fcf931ff-735a-4729-96fa-fbc57d4428c0","Type":"ContainerDied","Data":"b1df102f5b7638fa69b30f77253bccdcbcb5a2b15978635530353f71c3d727cf"}
Apr 22 19:03:34.653124 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:34.652982 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv"
Apr 22 19:03:34.653124 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:34.653007 2573 scope.go:117] "RemoveContainer" containerID="a60ae62b4728b65f87ef543ff7ccbd95001c5b736c8660d3883b24a235be6552"
Apr 22 19:03:34.661691 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:34.661673 2573 scope.go:117] "RemoveContainer" containerID="34723cd9c51b0f46515161f2dd1b16441b86d3b8a4ce4e472e679cd54e57d196"
Apr 22 19:03:34.668913 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:34.668894 2573 scope.go:117] "RemoveContainer" containerID="ae9b9b74466cd2dcc48d1698944da7e6aff24f73a76f882b354246ffd1168b54"
Apr 22 19:03:34.672764 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:34.672740 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv"]
Apr 22 19:03:34.676729 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:34.676707 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecfjfv"]
Apr 22 19:03:34.677673 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:34.677655 2573 scope.go:117] "RemoveContainer" containerID="a60ae62b4728b65f87ef543ff7ccbd95001c5b736c8660d3883b24a235be6552"
Apr 22 19:03:34.677951 ip-10-0-133-84 kubenswrapper[2573]: E0422 19:03:34.677931 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a60ae62b4728b65f87ef543ff7ccbd95001c5b736c8660d3883b24a235be6552\": container with ID starting with a60ae62b4728b65f87ef543ff7ccbd95001c5b736c8660d3883b24a235be6552 not found: ID does not exist" containerID="a60ae62b4728b65f87ef543ff7ccbd95001c5b736c8660d3883b24a235be6552"
Apr 22 19:03:34.678017 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:34.677963 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a60ae62b4728b65f87ef543ff7ccbd95001c5b736c8660d3883b24a235be6552"} err="failed to get container status \"a60ae62b4728b65f87ef543ff7ccbd95001c5b736c8660d3883b24a235be6552\": rpc error: code = NotFound desc = could not find container \"a60ae62b4728b65f87ef543ff7ccbd95001c5b736c8660d3883b24a235be6552\": container with ID starting with a60ae62b4728b65f87ef543ff7ccbd95001c5b736c8660d3883b24a235be6552 not found: ID does not exist"
Apr 22 19:03:34.678017 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:34.677989 2573 scope.go:117] "RemoveContainer" containerID="34723cd9c51b0f46515161f2dd1b16441b86d3b8a4ce4e472e679cd54e57d196"
Apr 22 19:03:34.678258 ip-10-0-133-84 kubenswrapper[2573]: E0422 19:03:34.678241 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34723cd9c51b0f46515161f2dd1b16441b86d3b8a4ce4e472e679cd54e57d196\": container with ID starting with 34723cd9c51b0f46515161f2dd1b16441b86d3b8a4ce4e472e679cd54e57d196 not found: ID does not exist" containerID="34723cd9c51b0f46515161f2dd1b16441b86d3b8a4ce4e472e679cd54e57d196"
Apr 22 19:03:34.678305 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:34.678265 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34723cd9c51b0f46515161f2dd1b16441b86d3b8a4ce4e472e679cd54e57d196"} err="failed to get container status \"34723cd9c51b0f46515161f2dd1b16441b86d3b8a4ce4e472e679cd54e57d196\": rpc error: code = NotFound desc = could not find container \"34723cd9c51b0f46515161f2dd1b16441b86d3b8a4ce4e472e679cd54e57d196\": container with ID starting with 34723cd9c51b0f46515161f2dd1b16441b86d3b8a4ce4e472e679cd54e57d196 not found: ID does not exist"
Apr 22 19:03:34.678305 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:34.678281 2573 scope.go:117] "RemoveContainer" containerID="ae9b9b74466cd2dcc48d1698944da7e6aff24f73a76f882b354246ffd1168b54"
Apr 22 19:03:34.678575 ip-10-0-133-84 kubenswrapper[2573]: E0422 19:03:34.678552 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae9b9b74466cd2dcc48d1698944da7e6aff24f73a76f882b354246ffd1168b54\": container with ID starting with ae9b9b74466cd2dcc48d1698944da7e6aff24f73a76f882b354246ffd1168b54 not found: ID does not exist" containerID="ae9b9b74466cd2dcc48d1698944da7e6aff24f73a76f882b354246ffd1168b54"
Apr 22 19:03:34.678575 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:34.678578 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae9b9b74466cd2dcc48d1698944da7e6aff24f73a76f882b354246ffd1168b54"} err="failed to get container status \"ae9b9b74466cd2dcc48d1698944da7e6aff24f73a76f882b354246ffd1168b54\": rpc error: code = NotFound desc = could not find container \"ae9b9b74466cd2dcc48d1698944da7e6aff24f73a76f882b354246ffd1168b54\": container with ID starting with ae9b9b74466cd2dcc48d1698944da7e6aff24f73a76f882b354246ffd1168b54 not found: ID does not exist"
Apr 22 19:03:35.364135 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:35.364100 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcf931ff-735a-4729-96fa-fbc57d4428c0" path="/var/lib/kubelet/pods/fcf931ff-735a-4729-96fa-fbc57d4428c0/volumes"
Apr 22 19:03:52.667844 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.667802 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr"]
Apr 22 19:03:52.668398 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.668294 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="884c537f-8d1a-4573-9b69-1eb9e1f8c3e9" containerName="main"
Apr 22 19:03:52.668398 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.668317 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="884c537f-8d1a-4573-9b69-1eb9e1f8c3e9" containerName="main"
Apr 22 19:03:52.668398 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.668334 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="884c537f-8d1a-4573-9b69-1eb9e1f8c3e9" containerName="storage-initializer"
Apr 22 19:03:52.668398 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.668343 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="884c537f-8d1a-4573-9b69-1eb9e1f8c3e9" containerName="storage-initializer"
Apr 22 19:03:52.668398 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.668358 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cd0a0692-d9e4-4848-a7df-452c680c6062" containerName="storage-initializer"
Apr 22 19:03:52.668398 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.668365 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd0a0692-d9e4-4848-a7df-452c680c6062" containerName="storage-initializer"
Apr 22 19:03:52.668398 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.668378 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fcf931ff-735a-4729-96fa-fbc57d4428c0" containerName="tokenizer"
Apr 22 19:03:52.668398 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.668385 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcf931ff-735a-4729-96fa-fbc57d4428c0" containerName="tokenizer"
Apr 22 19:03:52.668398 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.668398 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cd0a0692-d9e4-4848-a7df-452c680c6062" containerName="tokenizer"
Apr 22 19:03:52.668870 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.668405 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd0a0692-d9e4-4848-a7df-452c680c6062" containerName="tokenizer"
Apr 22 19:03:52.668870 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.668419 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fcf931ff-735a-4729-96fa-fbc57d4428c0" containerName="storage-initializer"
Apr 22 19:03:52.668870 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.668427 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcf931ff-735a-4729-96fa-fbc57d4428c0" containerName="storage-initializer"
Apr 22 19:03:52.668870 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.668440 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cd0a0692-d9e4-4848-a7df-452c680c6062" containerName="main"
Apr 22 19:03:52.668870 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.668449 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd0a0692-d9e4-4848-a7df-452c680c6062" containerName="main"
Apr 22 19:03:52.668870 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.668460 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fcf931ff-735a-4729-96fa-fbc57d4428c0" containerName="main"
Apr 22 19:03:52.668870 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.668470 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcf931ff-735a-4729-96fa-fbc57d4428c0" containerName="main"
Apr 22 19:03:52.668870 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.668552 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="fcf931ff-735a-4729-96fa-fbc57d4428c0" containerName="tokenizer"
Apr 22 19:03:52.668870 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.668565 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="fcf931ff-735a-4729-96fa-fbc57d4428c0" containerName="main"
Apr 22 19:03:52.668870 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.668577 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="884c537f-8d1a-4573-9b69-1eb9e1f8c3e9" containerName="main"
Apr 22 19:03:52.668870 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.668587 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="cd0a0692-d9e4-4848-a7df-452c680c6062" containerName="main"
Apr 22 19:03:52.668870 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.668600 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="cd0a0692-d9e4-4848-a7df-452c680c6062" containerName="tokenizer"
Apr 22 19:03:52.673253 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.673230 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr"
Apr 22 19:03:52.675844 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.675823 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4mhhb\""
Apr 22 19:03:52.676974 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.676950 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-epp-sa-dockercfg-s2d8h\""
Apr 22 19:03:52.677092 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.676974 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 22 19:03:52.677092 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.676960 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\""
Apr 22 19:03:52.677092 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.677019 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 22 19:03:52.682675 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.682660 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr"]
Apr 22 19:03:52.774319 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.774280 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr\" (UID: \"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr"
Apr 22 19:03:52.774319 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.774320 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrt28\" (UniqueName: \"kubernetes.io/projected/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6-kube-api-access-lrt28\") pod \"custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr\" (UID: \"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr"
Apr 22 19:03:52.774523 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.774378 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr\" (UID: \"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr"
Apr 22 19:03:52.774523 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.774428 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr\" (UID: \"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr"
Apr 22 19:03:52.774523 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.774452 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr\" (UID: \"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr"
Apr 22 19:03:52.774523 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.774512 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr\" (UID: \"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr"
Apr 22 19:03:52.874896 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.874864 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrt28\" (UniqueName: \"kubernetes.io/projected/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6-kube-api-access-lrt28\") pod \"custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr\" (UID: \"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr"
Apr 22 19:03:52.874896 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.874908 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr\" (UID: \"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr"
Apr 22 19:03:52.875121 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.874947 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr\" (UID: \"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr"
Apr 22 19:03:52.875121 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.874977 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr\" (UID: \"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr"
Apr 22 19:03:52.875121 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.875019 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr\" (UID: \"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr"
Apr 22 19:03:52.875121 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.875061 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr\" (UID: \"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr"
Apr 22 19:03:52.875462 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.875443 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr\" (UID: \"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr"
Apr 22 19:03:52.875523 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.875468 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr\" (UID: \"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr"
Apr 22 19:03:52.875586 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.875566 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr\" (UID: \"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr"
Apr 22 19:03:52.875620 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.875566 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr\" (UID: \"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr"
Apr 22 19:03:52.877799 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.877778 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName:
\"kubernetes.io/secret/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr\" (UID: \"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr" Apr 22 19:03:52.884942 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.884910 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrt28\" (UniqueName: \"kubernetes.io/projected/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6-kube-api-access-lrt28\") pod \"custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr\" (UID: \"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr" Apr 22 19:03:52.985072 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:52.985002 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr" Apr 22 19:03:53.109975 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:53.109943 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr"] Apr 22 19:03:53.112869 ip-10-0-133-84 kubenswrapper[2573]: W0422 19:03:53.112837 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9c8deda_cc2a_4ee8_b0f7_c45bcca55bf6.slice/crio-06146b3b8101d59adbab9a335047b6ad5303c2d4ed7b90dc50c5c23910f1b5a4 WatchSource:0}: Error finding container 06146b3b8101d59adbab9a335047b6ad5303c2d4ed7b90dc50c5c23910f1b5a4: Status 404 returned error can't find the container with id 06146b3b8101d59adbab9a335047b6ad5303c2d4ed7b90dc50c5c23910f1b5a4 Apr 22 19:03:53.114903 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:53.114885 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 
19:03:53.728092 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:53.728055 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr" event={"ID":"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6","Type":"ContainerStarted","Data":"b5c1443896b733354dc953ea7f6509c0e54e0240973f5d862c6fccac3aa1e9dd"} Apr 22 19:03:53.728092 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:53.728099 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr" event={"ID":"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6","Type":"ContainerStarted","Data":"06146b3b8101d59adbab9a335047b6ad5303c2d4ed7b90dc50c5c23910f1b5a4"} Apr 22 19:03:54.733245 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:54.733212 2573 generic.go:358] "Generic (PLEG): container finished" podID="e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6" containerID="b5c1443896b733354dc953ea7f6509c0e54e0240973f5d862c6fccac3aa1e9dd" exitCode=0 Apr 22 19:03:54.733604 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:54.733260 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr" event={"ID":"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6","Type":"ContainerDied","Data":"b5c1443896b733354dc953ea7f6509c0e54e0240973f5d862c6fccac3aa1e9dd"} Apr 22 19:03:55.747827 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:55.747784 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr" event={"ID":"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6","Type":"ContainerStarted","Data":"4de217aa0e5af8c6a74cb34838de64f6dde251d0d75ffd586f2233292454c33e"} Apr 22 19:03:55.747827 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:55.747831 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr" 
event={"ID":"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6","Type":"ContainerStarted","Data":"b1f3d40fb4fd0797948609df6adc1862073bbc43dd703623c9cfda2abb889ac1"} Apr 22 19:03:55.748353 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:55.747963 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr" Apr 22 19:03:55.768006 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:03:55.767947 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr" podStartSLOduration=3.767929787 podStartE2EDuration="3.767929787s" podCreationTimestamp="2026-04-22 19:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:03:55.766109682 +0000 UTC m=+1031.006225781" watchObservedRunningTime="2026-04-22 19:03:55.767929787 +0000 UTC m=+1031.008045889" Apr 22 19:04:02.986182 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:04:02.986132 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr" Apr 22 19:04:02.986678 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:04:02.986224 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr" Apr 22 19:04:02.988871 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:04:02.988836 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr" Apr 22 19:04:03.775823 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:04:03.775791 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr" Apr 22 19:04:24.779663 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:04:24.779632 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr" Apr 22 19:05:48.990008 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:05:48.989966 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr"] Apr 22 19:05:48.990479 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:05:48.990318 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr" podUID="e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6" containerName="main" containerID="cri-o://b1f3d40fb4fd0797948609df6adc1862073bbc43dd703623c9cfda2abb889ac1" gracePeriod=30 Apr 22 19:05:48.990479 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:05:48.990371 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr" podUID="e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6" containerName="tokenizer" containerID="cri-o://4de217aa0e5af8c6a74cb34838de64f6dde251d0d75ffd586f2233292454c33e" gracePeriod=30 Apr 22 19:05:49.154494 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:05:49.154458 2573 generic.go:358] "Generic (PLEG): container finished" podID="e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6" containerID="b1f3d40fb4fd0797948609df6adc1862073bbc43dd703623c9cfda2abb889ac1" exitCode=0 Apr 22 19:05:49.154686 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:05:49.154500 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr" 
event={"ID":"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6","Type":"ContainerDied","Data":"b1f3d40fb4fd0797948609df6adc1862073bbc43dd703623c9cfda2abb889ac1"} Apr 22 19:05:50.161236 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:05:50.161138 2573 generic.go:358] "Generic (PLEG): container finished" podID="e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6" containerID="4de217aa0e5af8c6a74cb34838de64f6dde251d0d75ffd586f2233292454c33e" exitCode=0 Apr 22 19:05:50.161236 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:05:50.161210 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr" event={"ID":"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6","Type":"ContainerDied","Data":"4de217aa0e5af8c6a74cb34838de64f6dde251d0d75ffd586f2233292454c33e"} Apr 22 19:05:50.239445 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:05:50.239425 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr" Apr 22 19:05:50.324787 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:05:50.324722 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrt28\" (UniqueName: \"kubernetes.io/projected/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6-kube-api-access-lrt28\") pod \"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6\" (UID: \"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6\") " Apr 22 19:05:50.324787 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:05:50.324752 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6-tokenizer-cache\") pod \"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6\" (UID: \"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6\") " Apr 22 19:05:50.324787 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:05:50.324781 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" 
(UniqueName: \"kubernetes.io/empty-dir/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6-tokenizer-tmp\") pod \"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6\" (UID: \"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6\") " Apr 22 19:05:50.324980 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:05:50.324818 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6-tls-certs\") pod \"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6\" (UID: \"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6\") " Apr 22 19:05:50.324980 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:05:50.324838 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6-kserve-provision-location\") pod \"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6\" (UID: \"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6\") " Apr 22 19:05:50.324980 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:05:50.324875 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6-tokenizer-uds\") pod \"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6\" (UID: \"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6\") " Apr 22 19:05:50.325096 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:05:50.324983 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6" (UID: "e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:05:50.325158 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:05:50.325132 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6-tokenizer-cache\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:05:50.325244 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:05:50.325154 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6" (UID: "e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:05:50.325244 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:05:50.325209 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6" (UID: "e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:05:50.325676 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:05:50.325649 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6" (UID: "e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:05:50.327070 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:05:50.327047 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6-kube-api-access-lrt28" (OuterVolumeSpecName: "kube-api-access-lrt28") pod "e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6" (UID: "e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6"). InnerVolumeSpecName "kube-api-access-lrt28". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:05:50.327155 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:05:50.327083 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6" (UID: "e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:05:50.426117 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:05:50.426093 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6-tls-certs\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:05:50.426117 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:05:50.426115 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6-kserve-provision-location\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:05:50.426275 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:05:50.426125 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6-tokenizer-uds\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:05:50.426275 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:05:50.426133 2573 
reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lrt28\" (UniqueName: \"kubernetes.io/projected/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6-kube-api-access-lrt28\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:05:50.426275 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:05:50.426143 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6-tokenizer-tmp\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:05:51.167127 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:05:51.167089 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr" event={"ID":"e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6","Type":"ContainerDied","Data":"06146b3b8101d59adbab9a335047b6ad5303c2d4ed7b90dc50c5c23910f1b5a4"} Apr 22 19:05:51.167625 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:05:51.167138 2573 scope.go:117] "RemoveContainer" containerID="4de217aa0e5af8c6a74cb34838de64f6dde251d0d75ffd586f2233292454c33e" Apr 22 19:05:51.167625 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:05:51.167185 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr" Apr 22 19:05:51.176344 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:05:51.176324 2573 scope.go:117] "RemoveContainer" containerID="b1f3d40fb4fd0797948609df6adc1862073bbc43dd703623c9cfda2abb889ac1" Apr 22 19:05:51.183748 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:05:51.183732 2573 scope.go:117] "RemoveContainer" containerID="b5c1443896b733354dc953ea7f6509c0e54e0240973f5d862c6fccac3aa1e9dd" Apr 22 19:05:51.188237 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:05:51.188208 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr"] Apr 22 19:05:51.191753 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:05:51.191734 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-785b9c44pgxqr"] Apr 22 19:05:51.364089 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:05:51.364061 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6" path="/var/lib/kubelet/pods/e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6/volumes" Apr 22 19:06:02.307269 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:02.307232 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr"] Apr 22 19:06:02.307707 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:02.307593 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6" containerName="tokenizer" Apr 22 19:06:02.307707 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:02.307605 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6" containerName="tokenizer" Apr 22 19:06:02.307707 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:02.307624 2573 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6" containerName="storage-initializer" Apr 22 19:06:02.307707 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:02.307634 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6" containerName="storage-initializer" Apr 22 19:06:02.307707 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:02.307643 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6" containerName="main" Apr 22 19:06:02.307707 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:02.307649 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6" containerName="main" Apr 22 19:06:02.307707 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:02.307697 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6" containerName="tokenizer" Apr 22 19:06:02.307707 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:02.307709 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e9c8deda-cc2a-4ee8-b0f7-c45bcca55bf6" containerName="main" Apr 22 19:06:02.312878 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:02.312854 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" Apr 22 19:06:02.315706 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:02.315671 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 19:06:02.315835 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:02.315749 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-epp-sa-dockercfg-x4kwq\"" Apr 22 19:06:02.316964 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:02.316895 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 22 19:06:02.317316 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:02.316913 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 19:06:02.317553 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:02.317032 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4mhhb\"" Apr 22 19:06:02.319597 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:02.319574 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr"] Apr 22 19:06:02.422041 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:02.422006 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl6fl\" (UniqueName: \"kubernetes.io/projected/a13b5406-db83-4209-b29f-489b6fc70ce1-kube-api-access-xl6fl\") pod \"router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr\" (UID: \"a13b5406-db83-4209-b29f-489b6fc70ce1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" Apr 22 19:06:02.422212 ip-10-0-133-84 
kubenswrapper[2573]: I0422 19:06:02.422049 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a13b5406-db83-4209-b29f-489b6fc70ce1-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr\" (UID: \"a13b5406-db83-4209-b29f-489b6fc70ce1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" Apr 22 19:06:02.422212 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:02.422105 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a13b5406-db83-4209-b29f-489b6fc70ce1-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr\" (UID: \"a13b5406-db83-4209-b29f-489b6fc70ce1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" Apr 22 19:06:02.422212 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:02.422124 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a13b5406-db83-4209-b29f-489b6fc70ce1-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr\" (UID: \"a13b5406-db83-4209-b29f-489b6fc70ce1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" Apr 22 19:06:02.422361 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:02.422224 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a13b5406-db83-4209-b29f-489b6fc70ce1-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr\" (UID: \"a13b5406-db83-4209-b29f-489b6fc70ce1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" Apr 
22 19:06:02.422361 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:02.422265 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a13b5406-db83-4209-b29f-489b6fc70ce1-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr\" (UID: \"a13b5406-db83-4209-b29f-489b6fc70ce1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" Apr 22 19:06:02.523445 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:02.523406 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a13b5406-db83-4209-b29f-489b6fc70ce1-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr\" (UID: \"a13b5406-db83-4209-b29f-489b6fc70ce1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" Apr 22 19:06:02.523445 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:02.523447 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a13b5406-db83-4209-b29f-489b6fc70ce1-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr\" (UID: \"a13b5406-db83-4209-b29f-489b6fc70ce1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" Apr 22 19:06:02.523689 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:02.523510 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a13b5406-db83-4209-b29f-489b6fc70ce1-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr\" (UID: \"a13b5406-db83-4209-b29f-489b6fc70ce1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" Apr 22 19:06:02.523689 ip-10-0-133-84 
kubenswrapper[2573]: I0422 19:06:02.523549 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a13b5406-db83-4209-b29f-489b6fc70ce1-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr\" (UID: \"a13b5406-db83-4209-b29f-489b6fc70ce1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" Apr 22 19:06:02.523689 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:02.523579 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xl6fl\" (UniqueName: \"kubernetes.io/projected/a13b5406-db83-4209-b29f-489b6fc70ce1-kube-api-access-xl6fl\") pod \"router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr\" (UID: \"a13b5406-db83-4209-b29f-489b6fc70ce1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" Apr 22 19:06:02.523872 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:02.523843 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a13b5406-db83-4209-b29f-489b6fc70ce1-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr\" (UID: \"a13b5406-db83-4209-b29f-489b6fc70ce1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" Apr 22 19:06:02.523924 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:02.523881 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a13b5406-db83-4209-b29f-489b6fc70ce1-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr\" (UID: \"a13b5406-db83-4209-b29f-489b6fc70ce1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" Apr 22 19:06:02.523983 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:02.523934 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a13b5406-db83-4209-b29f-489b6fc70ce1-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr\" (UID: \"a13b5406-db83-4209-b29f-489b6fc70ce1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" Apr 22 19:06:02.524114 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:02.524091 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a13b5406-db83-4209-b29f-489b6fc70ce1-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr\" (UID: \"a13b5406-db83-4209-b29f-489b6fc70ce1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" Apr 22 19:06:02.524210 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:02.524155 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a13b5406-db83-4209-b29f-489b6fc70ce1-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr\" (UID: \"a13b5406-db83-4209-b29f-489b6fc70ce1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" Apr 22 19:06:02.526471 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:02.526446 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a13b5406-db83-4209-b29f-489b6fc70ce1-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr\" (UID: \"a13b5406-db83-4209-b29f-489b6fc70ce1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" Apr 22 19:06:02.531571 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:02.531546 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl6fl\" (UniqueName: 
\"kubernetes.io/projected/a13b5406-db83-4209-b29f-489b6fc70ce1-kube-api-access-xl6fl\") pod \"router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr\" (UID: \"a13b5406-db83-4209-b29f-489b6fc70ce1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" Apr 22 19:06:02.625228 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:02.625196 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" Apr 22 19:06:02.752159 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:02.752100 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr"] Apr 22 19:06:02.754581 ip-10-0-133-84 kubenswrapper[2573]: W0422 19:06:02.754548 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda13b5406_db83_4209_b29f_489b6fc70ce1.slice/crio-9e061ce185dd1346bdc79a8097b5dc1e54ac64ec413f43726965807f114b2dfe WatchSource:0}: Error finding container 9e061ce185dd1346bdc79a8097b5dc1e54ac64ec413f43726965807f114b2dfe: Status 404 returned error can't find the container with id 9e061ce185dd1346bdc79a8097b5dc1e54ac64ec413f43726965807f114b2dfe Apr 22 19:06:03.211357 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:03.211318 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" event={"ID":"a13b5406-db83-4209-b29f-489b6fc70ce1","Type":"ContainerStarted","Data":"244c720ada9a81273b0191806f18f7f0226961aa78bc2645e7f9dd1bd624ad13"} Apr 22 19:06:03.211525 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:03.211365 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" 
event={"ID":"a13b5406-db83-4209-b29f-489b6fc70ce1","Type":"ContainerStarted","Data":"9e061ce185dd1346bdc79a8097b5dc1e54ac64ec413f43726965807f114b2dfe"} Apr 22 19:06:04.216125 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:04.216089 2573 generic.go:358] "Generic (PLEG): container finished" podID="a13b5406-db83-4209-b29f-489b6fc70ce1" containerID="244c720ada9a81273b0191806f18f7f0226961aa78bc2645e7f9dd1bd624ad13" exitCode=0 Apr 22 19:06:04.216610 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:04.216202 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" event={"ID":"a13b5406-db83-4209-b29f-489b6fc70ce1","Type":"ContainerDied","Data":"244c720ada9a81273b0191806f18f7f0226961aa78bc2645e7f9dd1bd624ad13"} Apr 22 19:06:05.221655 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:05.221613 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" event={"ID":"a13b5406-db83-4209-b29f-489b6fc70ce1","Type":"ContainerStarted","Data":"78c5a38383afa54491ed435043157ab4114da6c54990aee4d4fa69013a0854f3"} Apr 22 19:06:05.221655 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:05.221659 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" event={"ID":"a13b5406-db83-4209-b29f-489b6fc70ce1","Type":"ContainerStarted","Data":"02565d4314e8ec123524c466d92f7b08bd2f9c593ab8fc28cd79c09ee7c369f5"} Apr 22 19:06:05.222208 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:05.221761 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" Apr 22 19:06:05.242646 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:05.242599 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" podStartSLOduration=3.242586611 podStartE2EDuration="3.242586611s" podCreationTimestamp="2026-04-22 19:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:06:05.239921896 +0000 UTC m=+1160.480038000" watchObservedRunningTime="2026-04-22 19:06:05.242586611 +0000 UTC m=+1160.482702712" Apr 22 19:06:12.625735 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:12.625700 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" Apr 22 19:06:12.625735 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:12.625740 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" Apr 22 19:06:12.628350 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:12.628324 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" Apr 22 19:06:13.257718 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:13.257685 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" Apr 22 19:06:20.100665 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:20.100630 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-74f8bc794f-ng6xl"] Apr 22 19:06:20.101501 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:20.101447 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/llmisvc-controller-manager-74f8bc794f-ng6xl" podUID="d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c" containerName="manager" 
containerID="cri-o://85a642165f204157f5646f999372a88d0d9712cf29b2a18d2768f2ec312a56ed" gracePeriod=30 Apr 22 19:06:25.296261 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:25.296229 2573 generic.go:358] "Generic (PLEG): container finished" podID="d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c" containerID="85a642165f204157f5646f999372a88d0d9712cf29b2a18d2768f2ec312a56ed" exitCode=0 Apr 22 19:06:25.296723 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:25.296299 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-74f8bc794f-ng6xl" event={"ID":"d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c","Type":"ContainerDied","Data":"85a642165f204157f5646f999372a88d0d9712cf29b2a18d2768f2ec312a56ed"} Apr 22 19:06:25.338751 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:25.338727 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-74f8bc794f-ng6xl" Apr 22 19:06:25.412484 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:25.412455 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c-cert\") pod \"d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c\" (UID: \"d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c\") " Apr 22 19:06:25.412663 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:25.412516 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gwhv\" (UniqueName: \"kubernetes.io/projected/d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c-kube-api-access-9gwhv\") pod \"d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c\" (UID: \"d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c\") " Apr 22 19:06:25.414877 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:25.414845 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c-cert" (OuterVolumeSpecName: "cert") pod "d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c" (UID: 
"d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:06:25.414992 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:25.414879 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c-kube-api-access-9gwhv" (OuterVolumeSpecName: "kube-api-access-9gwhv") pod "d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c" (UID: "d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c"). InnerVolumeSpecName "kube-api-access-9gwhv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:06:25.513497 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:25.513419 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9gwhv\" (UniqueName: \"kubernetes.io/projected/d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c-kube-api-access-9gwhv\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:06:25.513497 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:25.513447 2573 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c-cert\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:06:26.301073 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:26.301045 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-74f8bc794f-ng6xl" Apr 22 19:06:26.301483 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:26.301071 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-74f8bc794f-ng6xl" event={"ID":"d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c","Type":"ContainerDied","Data":"dd321f73edf11399383db2b590f10938dd374b3049cedf06e8c919b01931e6fe"} Apr 22 19:06:26.301483 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:26.301116 2573 scope.go:117] "RemoveContainer" containerID="85a642165f204157f5646f999372a88d0d9712cf29b2a18d2768f2ec312a56ed" Apr 22 19:06:26.321048 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:26.321022 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-74f8bc794f-ng6xl"] Apr 22 19:06:26.323850 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:26.323827 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/llmisvc-controller-manager-74f8bc794f-ng6xl"] Apr 22 19:06:27.367184 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:27.367146 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c" path="/var/lib/kubelet/pods/d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c/volumes" Apr 22 19:06:34.266288 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:34.266259 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" Apr 22 19:06:45.360800 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:45.360771 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cr6wp_bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278/ovn-acl-logging/0.log" Apr 22 19:06:45.363494 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:06:45.363471 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cr6wp_bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278/ovn-acl-logging/0.log" Apr 22 19:07:31.457486 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:31.457456 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm"] Apr 22 19:07:31.459928 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:31.457869 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c" containerName="manager" Apr 22 19:07:31.459928 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:31.457902 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c" containerName="manager" Apr 22 19:07:31.459928 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:31.458066 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="d9311ef2-e4cd-4cf7-bee6-326c98b3ef2c" containerName="manager" Apr 22 19:07:31.460900 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:31.460886 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" Apr 22 19:07:31.463634 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:31.463612 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-epp-sa-dockercfg-4dv6t\"" Apr 22 19:07:31.464006 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:31.463989 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 22 19:07:31.473048 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:31.473025 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm"] Apr 22 19:07:31.523513 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:31.523478 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm\" (UID: \"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" Apr 22 19:07:31.523662 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:31.523528 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgnmt\" (UniqueName: \"kubernetes.io/projected/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9-kube-api-access-zgnmt\") pod \"router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm\" (UID: \"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" Apr 22 19:07:31.523662 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:31.523622 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm\" (UID: \"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" Apr 22 19:07:31.523774 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:31.523670 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm\" (UID: \"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" Apr 22 19:07:31.523774 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:31.523728 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm\" (UID: \"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" Apr 22 19:07:31.523869 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:31.523791 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm\" (UID: \"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" Apr 22 19:07:31.624567 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:31.624528 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm\" (UID: \"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" Apr 22 19:07:31.624749 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:31.624570 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm\" (UID: \"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" Apr 22 19:07:31.624749 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:31.624626 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zgnmt\" (UniqueName: \"kubernetes.io/projected/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9-kube-api-access-zgnmt\") pod \"router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm\" (UID: \"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" Apr 22 19:07:31.624749 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:31.624672 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm\" (UID: \"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" Apr 22 19:07:31.624916 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:31.624797 2573 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm\" (UID: \"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" Apr 22 19:07:31.624916 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:31.624841 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm\" (UID: \"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" Apr 22 19:07:31.625024 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:31.624924 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm\" (UID: \"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" Apr 22 19:07:31.625024 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:31.624965 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm\" (UID: \"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" Apr 22 19:07:31.625122 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:31.625104 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm\" (UID: \"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" Apr 22 19:07:31.625158 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:31.625129 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm\" (UID: \"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" Apr 22 19:07:31.627244 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:31.627221 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm\" (UID: \"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" Apr 22 19:07:31.633145 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:31.633122 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgnmt\" (UniqueName: \"kubernetes.io/projected/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9-kube-api-access-zgnmt\") pod \"router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm\" (UID: \"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" Apr 22 19:07:31.770435 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:31.770349 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" Apr 22 19:07:31.897354 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:31.897330 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm"] Apr 22 19:07:31.899391 ip-10-0-133-84 kubenswrapper[2573]: W0422 19:07:31.899363 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf026162_7b9e_44d3_a32f_2a9fd9ecd4e9.slice/crio-e4af67cee9e0c4ebe42c86a57246c09e06895091762156cfab44d2e60e173a3f WatchSource:0}: Error finding container e4af67cee9e0c4ebe42c86a57246c09e06895091762156cfab44d2e60e173a3f: Status 404 returned error can't find the container with id e4af67cee9e0c4ebe42c86a57246c09e06895091762156cfab44d2e60e173a3f Apr 22 19:07:32.523420 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:32.523381 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" event={"ID":"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9","Type":"ContainerStarted","Data":"31b9cab9bd5ca1e91e21059b3f409ce6ec0894f9fbe1a2d0b3a564eb04cf19be"} Apr 22 19:07:32.523420 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:32.523422 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" event={"ID":"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9","Type":"ContainerStarted","Data":"e4af67cee9e0c4ebe42c86a57246c09e06895091762156cfab44d2e60e173a3f"} Apr 22 19:07:33.528103 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:33.528069 2573 generic.go:358] "Generic (PLEG): container finished" podID="cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9" containerID="31b9cab9bd5ca1e91e21059b3f409ce6ec0894f9fbe1a2d0b3a564eb04cf19be" exitCode=0 Apr 22 19:07:33.528505 ip-10-0-133-84 kubenswrapper[2573]: I0422 
19:07:33.528135 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" event={"ID":"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9","Type":"ContainerDied","Data":"31b9cab9bd5ca1e91e21059b3f409ce6ec0894f9fbe1a2d0b3a564eb04cf19be"} Apr 22 19:07:34.533603 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:34.533564 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" event={"ID":"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9","Type":"ContainerStarted","Data":"43aef93c5c3ef238b0e2017e362b68d0111a114dfe4ccc2c6113d32f09d11588"} Apr 22 19:07:34.533982 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:34.533612 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" event={"ID":"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9","Type":"ContainerStarted","Data":"29cf9e09df93fcb10fba4197418a6a37c86de116d4a42da13a972c99f7c6cd31"} Apr 22 19:07:34.533982 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:34.533710 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" Apr 22 19:07:34.553085 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:34.553036 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" podStartSLOduration=3.553022691 podStartE2EDuration="3.553022691s" podCreationTimestamp="2026-04-22 19:07:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:07:34.551157178 +0000 UTC m=+1249.791273280" watchObservedRunningTime="2026-04-22 19:07:34.553022691 +0000 UTC m=+1249.793138794" Apr 22 19:07:41.771279 ip-10-0-133-84 
kubenswrapper[2573]: I0422 19:07:41.771239 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" Apr 22 19:07:41.771279 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:41.771285 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" Apr 22 19:07:41.773868 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:41.773838 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" Apr 22 19:07:42.564739 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:42.564709 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" Apr 22 19:07:48.590266 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:48.590230 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr"] Apr 22 19:07:48.590682 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:48.590611 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" podUID="a13b5406-db83-4209-b29f-489b6fc70ce1" containerName="main" containerID="cri-o://02565d4314e8ec123524c466d92f7b08bd2f9c593ab8fc28cd79c09ee7c369f5" gracePeriod=30 Apr 22 19:07:48.590752 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:48.590668 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" podUID="a13b5406-db83-4209-b29f-489b6fc70ce1" containerName="tokenizer" containerID="cri-o://78c5a38383afa54491ed435043157ab4114da6c54990aee4d4fa69013a0854f3" gracePeriod=30 
Apr 22 19:07:49.587465 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:49.587432 2573 generic.go:358] "Generic (PLEG): container finished" podID="a13b5406-db83-4209-b29f-489b6fc70ce1" containerID="02565d4314e8ec123524c466d92f7b08bd2f9c593ab8fc28cd79c09ee7c369f5" exitCode=0 Apr 22 19:07:49.587588 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:49.587504 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" event={"ID":"a13b5406-db83-4209-b29f-489b6fc70ce1","Type":"ContainerDied","Data":"02565d4314e8ec123524c466d92f7b08bd2f9c593ab8fc28cd79c09ee7c369f5"} Apr 22 19:07:49.749196 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:49.749155 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" Apr 22 19:07:49.853727 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:49.853632 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl6fl\" (UniqueName: \"kubernetes.io/projected/a13b5406-db83-4209-b29f-489b6fc70ce1-kube-api-access-xl6fl\") pod \"a13b5406-db83-4209-b29f-489b6fc70ce1\" (UID: \"a13b5406-db83-4209-b29f-489b6fc70ce1\") " Apr 22 19:07:49.853727 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:49.853666 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a13b5406-db83-4209-b29f-489b6fc70ce1-kserve-provision-location\") pod \"a13b5406-db83-4209-b29f-489b6fc70ce1\" (UID: \"a13b5406-db83-4209-b29f-489b6fc70ce1\") " Apr 22 19:07:49.853727 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:49.853695 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a13b5406-db83-4209-b29f-489b6fc70ce1-tokenizer-tmp\") pod \"a13b5406-db83-4209-b29f-489b6fc70ce1\" 
(UID: \"a13b5406-db83-4209-b29f-489b6fc70ce1\") " Apr 22 19:07:49.853727 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:49.853717 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a13b5406-db83-4209-b29f-489b6fc70ce1-tls-certs\") pod \"a13b5406-db83-4209-b29f-489b6fc70ce1\" (UID: \"a13b5406-db83-4209-b29f-489b6fc70ce1\") " Apr 22 19:07:49.854069 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:49.853751 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a13b5406-db83-4209-b29f-489b6fc70ce1-tokenizer-cache\") pod \"a13b5406-db83-4209-b29f-489b6fc70ce1\" (UID: \"a13b5406-db83-4209-b29f-489b6fc70ce1\") " Apr 22 19:07:49.854069 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:49.853767 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a13b5406-db83-4209-b29f-489b6fc70ce1-tokenizer-uds\") pod \"a13b5406-db83-4209-b29f-489b6fc70ce1\" (UID: \"a13b5406-db83-4209-b29f-489b6fc70ce1\") " Apr 22 19:07:49.854199 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:49.854090 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a13b5406-db83-4209-b29f-489b6fc70ce1-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "a13b5406-db83-4209-b29f-489b6fc70ce1" (UID: "a13b5406-db83-4209-b29f-489b6fc70ce1"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:07:49.854199 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:49.854110 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a13b5406-db83-4209-b29f-489b6fc70ce1-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "a13b5406-db83-4209-b29f-489b6fc70ce1" (UID: "a13b5406-db83-4209-b29f-489b6fc70ce1"). 
InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:07:49.854199 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:49.854148 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a13b5406-db83-4209-b29f-489b6fc70ce1-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "a13b5406-db83-4209-b29f-489b6fc70ce1" (UID: "a13b5406-db83-4209-b29f-489b6fc70ce1"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:07:49.854584 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:49.854557 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a13b5406-db83-4209-b29f-489b6fc70ce1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a13b5406-db83-4209-b29f-489b6fc70ce1" (UID: "a13b5406-db83-4209-b29f-489b6fc70ce1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:07:49.855959 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:49.855936 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a13b5406-db83-4209-b29f-489b6fc70ce1-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a13b5406-db83-4209-b29f-489b6fc70ce1" (UID: "a13b5406-db83-4209-b29f-489b6fc70ce1"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:07:49.856013 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:49.855985 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a13b5406-db83-4209-b29f-489b6fc70ce1-kube-api-access-xl6fl" (OuterVolumeSpecName: "kube-api-access-xl6fl") pod "a13b5406-db83-4209-b29f-489b6fc70ce1" (UID: "a13b5406-db83-4209-b29f-489b6fc70ce1"). InnerVolumeSpecName "kube-api-access-xl6fl". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:07:49.954745 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:49.954713 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a13b5406-db83-4209-b29f-489b6fc70ce1-tokenizer-tmp\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:07:49.954745 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:49.954741 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a13b5406-db83-4209-b29f-489b6fc70ce1-tls-certs\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:07:49.954745 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:49.954750 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a13b5406-db83-4209-b29f-489b6fc70ce1-tokenizer-cache\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:07:49.954965 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:49.954758 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a13b5406-db83-4209-b29f-489b6fc70ce1-tokenizer-uds\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:07:49.954965 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:49.954768 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xl6fl\" (UniqueName: \"kubernetes.io/projected/a13b5406-db83-4209-b29f-489b6fc70ce1-kube-api-access-xl6fl\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:07:49.954965 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:49.954777 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a13b5406-db83-4209-b29f-489b6fc70ce1-kserve-provision-location\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:07:50.591884 ip-10-0-133-84 kubenswrapper[2573]: 
I0422 19:07:50.591846 2573 generic.go:358] "Generic (PLEG): container finished" podID="a13b5406-db83-4209-b29f-489b6fc70ce1" containerID="78c5a38383afa54491ed435043157ab4114da6c54990aee4d4fa69013a0854f3" exitCode=0 Apr 22 19:07:50.592075 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:50.591912 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" event={"ID":"a13b5406-db83-4209-b29f-489b6fc70ce1","Type":"ContainerDied","Data":"78c5a38383afa54491ed435043157ab4114da6c54990aee4d4fa69013a0854f3"} Apr 22 19:07:50.592075 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:50.591935 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" Apr 22 19:07:50.592075 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:50.591948 2573 scope.go:117] "RemoveContainer" containerID="78c5a38383afa54491ed435043157ab4114da6c54990aee4d4fa69013a0854f3" Apr 22 19:07:50.592075 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:50.591939 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr" event={"ID":"a13b5406-db83-4209-b29f-489b6fc70ce1","Type":"ContainerDied","Data":"9e061ce185dd1346bdc79a8097b5dc1e54ac64ec413f43726965807f114b2dfe"} Apr 22 19:07:50.600112 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:50.600083 2573 scope.go:117] "RemoveContainer" containerID="02565d4314e8ec123524c466d92f7b08bd2f9c593ab8fc28cd79c09ee7c369f5" Apr 22 19:07:50.607218 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:50.607200 2573 scope.go:117] "RemoveContainer" containerID="244c720ada9a81273b0191806f18f7f0226961aa78bc2645e7f9dd1bd624ad13" Apr 22 19:07:50.612585 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:50.612564 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr"] Apr 22 19:07:50.615658 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:50.615637 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-587c4b5d79-b7qxr"] Apr 22 19:07:50.615842 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:50.615826 2573 scope.go:117] "RemoveContainer" containerID="78c5a38383afa54491ed435043157ab4114da6c54990aee4d4fa69013a0854f3" Apr 22 19:07:50.616117 ip-10-0-133-84 kubenswrapper[2573]: E0422 19:07:50.616099 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78c5a38383afa54491ed435043157ab4114da6c54990aee4d4fa69013a0854f3\": container with ID starting with 78c5a38383afa54491ed435043157ab4114da6c54990aee4d4fa69013a0854f3 not found: ID does not exist" containerID="78c5a38383afa54491ed435043157ab4114da6c54990aee4d4fa69013a0854f3" Apr 22 19:07:50.616200 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:50.616134 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78c5a38383afa54491ed435043157ab4114da6c54990aee4d4fa69013a0854f3"} err="failed to get container status \"78c5a38383afa54491ed435043157ab4114da6c54990aee4d4fa69013a0854f3\": rpc error: code = NotFound desc = could not find container \"78c5a38383afa54491ed435043157ab4114da6c54990aee4d4fa69013a0854f3\": container with ID starting with 78c5a38383afa54491ed435043157ab4114da6c54990aee4d4fa69013a0854f3 not found: ID does not exist" Apr 22 19:07:50.616200 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:50.616152 2573 scope.go:117] "RemoveContainer" containerID="02565d4314e8ec123524c466d92f7b08bd2f9c593ab8fc28cd79c09ee7c369f5" Apr 22 19:07:50.616494 ip-10-0-133-84 kubenswrapper[2573]: E0422 19:07:50.616472 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"02565d4314e8ec123524c466d92f7b08bd2f9c593ab8fc28cd79c09ee7c369f5\": container with ID starting with 02565d4314e8ec123524c466d92f7b08bd2f9c593ab8fc28cd79c09ee7c369f5 not found: ID does not exist" containerID="02565d4314e8ec123524c466d92f7b08bd2f9c593ab8fc28cd79c09ee7c369f5" Apr 22 19:07:50.616539 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:50.616507 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02565d4314e8ec123524c466d92f7b08bd2f9c593ab8fc28cd79c09ee7c369f5"} err="failed to get container status \"02565d4314e8ec123524c466d92f7b08bd2f9c593ab8fc28cd79c09ee7c369f5\": rpc error: code = NotFound desc = could not find container \"02565d4314e8ec123524c466d92f7b08bd2f9c593ab8fc28cd79c09ee7c369f5\": container with ID starting with 02565d4314e8ec123524c466d92f7b08bd2f9c593ab8fc28cd79c09ee7c369f5 not found: ID does not exist" Apr 22 19:07:50.616539 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:50.616531 2573 scope.go:117] "RemoveContainer" containerID="244c720ada9a81273b0191806f18f7f0226961aa78bc2645e7f9dd1bd624ad13" Apr 22 19:07:50.616736 ip-10-0-133-84 kubenswrapper[2573]: E0422 19:07:50.616720 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"244c720ada9a81273b0191806f18f7f0226961aa78bc2645e7f9dd1bd624ad13\": container with ID starting with 244c720ada9a81273b0191806f18f7f0226961aa78bc2645e7f9dd1bd624ad13 not found: ID does not exist" containerID="244c720ada9a81273b0191806f18f7f0226961aa78bc2645e7f9dd1bd624ad13" Apr 22 19:07:50.616772 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:50.616742 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"244c720ada9a81273b0191806f18f7f0226961aa78bc2645e7f9dd1bd624ad13"} err="failed to get container status \"244c720ada9a81273b0191806f18f7f0226961aa78bc2645e7f9dd1bd624ad13\": rpc error: code = NotFound desc = could not find container 
\"244c720ada9a81273b0191806f18f7f0226961aa78bc2645e7f9dd1bd624ad13\": container with ID starting with 244c720ada9a81273b0191806f18f7f0226961aa78bc2645e7f9dd1bd624ad13 not found: ID does not exist" Apr 22 19:07:51.364196 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:07:51.364130 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a13b5406-db83-4209-b29f-489b6fc70ce1" path="/var/lib/kubelet/pods/a13b5406-db83-4209-b29f-489b6fc70ce1/volumes" Apr 22 19:08:03.568876 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:03.568847 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" Apr 22 19:08:04.345934 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:04.345886 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8"] Apr 22 19:08:04.346252 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:04.346237 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a13b5406-db83-4209-b29f-489b6fc70ce1" containerName="main" Apr 22 19:08:04.346252 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:04.346252 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13b5406-db83-4209-b29f-489b6fc70ce1" containerName="main" Apr 22 19:08:04.346424 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:04.346266 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a13b5406-db83-4209-b29f-489b6fc70ce1" containerName="tokenizer" Apr 22 19:08:04.346424 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:04.346273 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13b5406-db83-4209-b29f-489b6fc70ce1" containerName="tokenizer" Apr 22 19:08:04.346424 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:04.346288 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a13b5406-db83-4209-b29f-489b6fc70ce1" 
containerName="storage-initializer" Apr 22 19:08:04.346424 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:04.346294 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13b5406-db83-4209-b29f-489b6fc70ce1" containerName="storage-initializer" Apr 22 19:08:04.346424 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:04.346348 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="a13b5406-db83-4209-b29f-489b6fc70ce1" containerName="main" Apr 22 19:08:04.346424 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:04.346356 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="a13b5406-db83-4209-b29f-489b6fc70ce1" containerName="tokenizer" Apr 22 19:08:04.351477 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:04.351453 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" Apr 22 19:08:04.353988 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:04.353962 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-epp-sa-dockercfg-z8rxs\"" Apr 22 19:08:04.354115 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:04.354045 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 22 19:08:04.358432 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:04.358405 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8"] Apr 22 19:08:04.472458 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:04.472423 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f7ae567c-2446-4dd7-85c8-92fedd7d1be2-tokenizer-uds\") pod 
\"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8\" (UID: \"f7ae567c-2446-4dd7-85c8-92fedd7d1be2\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" Apr 22 19:08:04.472640 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:04.472466 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxvl7\" (UniqueName: \"kubernetes.io/projected/f7ae567c-2446-4dd7-85c8-92fedd7d1be2-kube-api-access-lxvl7\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8\" (UID: \"f7ae567c-2446-4dd7-85c8-92fedd7d1be2\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" Apr 22 19:08:04.472640 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:04.472489 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f7ae567c-2446-4dd7-85c8-92fedd7d1be2-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8\" (UID: \"f7ae567c-2446-4dd7-85c8-92fedd7d1be2\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" Apr 22 19:08:04.472640 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:04.472539 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f7ae567c-2446-4dd7-85c8-92fedd7d1be2-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8\" (UID: \"f7ae567c-2446-4dd7-85c8-92fedd7d1be2\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" Apr 22 19:08:04.472640 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:04.472597 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/f7ae567c-2446-4dd7-85c8-92fedd7d1be2-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8\" (UID: \"f7ae567c-2446-4dd7-85c8-92fedd7d1be2\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" Apr 22 19:08:04.472640 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:04.472631 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f7ae567c-2446-4dd7-85c8-92fedd7d1be2-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8\" (UID: \"f7ae567c-2446-4dd7-85c8-92fedd7d1be2\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" Apr 22 19:08:04.573482 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:04.573440 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f7ae567c-2446-4dd7-85c8-92fedd7d1be2-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8\" (UID: \"f7ae567c-2446-4dd7-85c8-92fedd7d1be2\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" Apr 22 19:08:04.573482 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:04.573487 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lxvl7\" (UniqueName: \"kubernetes.io/projected/f7ae567c-2446-4dd7-85c8-92fedd7d1be2-kube-api-access-lxvl7\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8\" (UID: \"f7ae567c-2446-4dd7-85c8-92fedd7d1be2\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" Apr 22 19:08:04.573973 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:04.573510 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/f7ae567c-2446-4dd7-85c8-92fedd7d1be2-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8\" (UID: \"f7ae567c-2446-4dd7-85c8-92fedd7d1be2\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" Apr 22 19:08:04.573973 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:04.573527 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f7ae567c-2446-4dd7-85c8-92fedd7d1be2-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8\" (UID: \"f7ae567c-2446-4dd7-85c8-92fedd7d1be2\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" Apr 22 19:08:04.573973 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:04.573555 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f7ae567c-2446-4dd7-85c8-92fedd7d1be2-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8\" (UID: \"f7ae567c-2446-4dd7-85c8-92fedd7d1be2\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" Apr 22 19:08:04.573973 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:04.573674 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f7ae567c-2446-4dd7-85c8-92fedd7d1be2-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8\" (UID: \"f7ae567c-2446-4dd7-85c8-92fedd7d1be2\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" Apr 22 19:08:04.574114 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:04.573986 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/f7ae567c-2446-4dd7-85c8-92fedd7d1be2-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8\" (UID: \"f7ae567c-2446-4dd7-85c8-92fedd7d1be2\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" Apr 22 19:08:04.574114 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:04.574045 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f7ae567c-2446-4dd7-85c8-92fedd7d1be2-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8\" (UID: \"f7ae567c-2446-4dd7-85c8-92fedd7d1be2\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" Apr 22 19:08:04.574114 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:04.574085 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f7ae567c-2446-4dd7-85c8-92fedd7d1be2-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8\" (UID: \"f7ae567c-2446-4dd7-85c8-92fedd7d1be2\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" Apr 22 19:08:04.574271 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:04.574133 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f7ae567c-2446-4dd7-85c8-92fedd7d1be2-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8\" (UID: \"f7ae567c-2446-4dd7-85c8-92fedd7d1be2\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" Apr 22 19:08:04.576511 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:04.576494 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f7ae567c-2446-4dd7-85c8-92fedd7d1be2-tls-certs\") 
pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8\" (UID: \"f7ae567c-2446-4dd7-85c8-92fedd7d1be2\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" Apr 22 19:08:04.581470 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:04.581449 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxvl7\" (UniqueName: \"kubernetes.io/projected/f7ae567c-2446-4dd7-85c8-92fedd7d1be2-kube-api-access-lxvl7\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8\" (UID: \"f7ae567c-2446-4dd7-85c8-92fedd7d1be2\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" Apr 22 19:08:04.663880 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:04.663852 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" Apr 22 19:08:04.796499 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:04.796474 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8"] Apr 22 19:08:04.798876 ip-10-0-133-84 kubenswrapper[2573]: W0422 19:08:04.798842 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7ae567c_2446_4dd7_85c8_92fedd7d1be2.slice/crio-ea8bddf48bfac8035cbcd44133305367a642bb62f4f2edd582953a0ba287615a WatchSource:0}: Error finding container ea8bddf48bfac8035cbcd44133305367a642bb62f4f2edd582953a0ba287615a: Status 404 returned error can't find the container with id ea8bddf48bfac8035cbcd44133305367a642bb62f4f2edd582953a0ba287615a Apr 22 19:08:05.643647 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:05.643618 2573 generic.go:358] "Generic (PLEG): container finished" podID="f7ae567c-2446-4dd7-85c8-92fedd7d1be2" containerID="5a86b7aa861713d69fa77d83a193ce5a0a19bad5dcb6fb2959d600452fdc35c1" 
exitCode=0 Apr 22 19:08:05.644008 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:05.643706 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" event={"ID":"f7ae567c-2446-4dd7-85c8-92fedd7d1be2","Type":"ContainerDied","Data":"5a86b7aa861713d69fa77d83a193ce5a0a19bad5dcb6fb2959d600452fdc35c1"} Apr 22 19:08:05.644008 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:05.643753 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" event={"ID":"f7ae567c-2446-4dd7-85c8-92fedd7d1be2","Type":"ContainerStarted","Data":"ea8bddf48bfac8035cbcd44133305367a642bb62f4f2edd582953a0ba287615a"} Apr 22 19:08:06.650214 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:06.650148 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" event={"ID":"f7ae567c-2446-4dd7-85c8-92fedd7d1be2","Type":"ContainerStarted","Data":"76018c635b9e8ec2bae48bc4037257b55f8db4a1f93d3840a97215e6ca9f0bc4"} Apr 22 19:08:06.650214 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:06.650211 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" event={"ID":"f7ae567c-2446-4dd7-85c8-92fedd7d1be2","Type":"ContainerStarted","Data":"a33500a1dd533aa6da1991b7da25d3a70637710c975241da4078106427326041"} Apr 22 19:08:06.650652 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:06.650312 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" Apr 22 19:08:06.669957 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:06.669896 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" podStartSLOduration=2.669880691 podStartE2EDuration="2.669880691s" podCreationTimestamp="2026-04-22 19:08:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:08:06.668088768 +0000 UTC m=+1281.908204873" watchObservedRunningTime="2026-04-22 19:08:06.669880691 +0000 UTC m=+1281.909996793" Apr 22 19:08:14.664526 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:14.664491 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" Apr 22 19:08:14.664526 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:14.664536 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" Apr 22 19:08:14.667613 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:14.667588 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" Apr 22 19:08:14.677740 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:14.677716 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" Apr 22 19:08:35.681116 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:08:35.681082 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" Apr 22 19:09:49.298993 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:49.298952 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm"] Apr 22 19:09:49.299520 ip-10-0-133-84 kubenswrapper[2573]: I0422 
19:09:49.299304 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" podUID="cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9" containerName="main" containerID="cri-o://29cf9e09df93fcb10fba4197418a6a37c86de116d4a42da13a972c99f7c6cd31" gracePeriod=30 Apr 22 19:09:49.299520 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:49.299346 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" podUID="cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9" containerName="tokenizer" containerID="cri-o://43aef93c5c3ef238b0e2017e362b68d0111a114dfe4ccc2c6113d32f09d11588" gracePeriod=30 Apr 22 19:09:50.019496 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:50.019456 2573 generic.go:358] "Generic (PLEG): container finished" podID="cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9" containerID="29cf9e09df93fcb10fba4197418a6a37c86de116d4a42da13a972c99f7c6cd31" exitCode=0 Apr 22 19:09:50.019496 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:50.019499 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" event={"ID":"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9","Type":"ContainerDied","Data":"29cf9e09df93fcb10fba4197418a6a37c86de116d4a42da13a972c99f7c6cd31"} Apr 22 19:09:50.540935 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:50.540913 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" Apr 22 19:09:50.556995 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:50.556973 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9-kserve-provision-location\") pod \"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9\" (UID: \"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9\") " Apr 22 19:09:50.557108 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:50.557019 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9-tokenizer-tmp\") pod \"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9\" (UID: \"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9\") " Apr 22 19:09:50.557108 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:50.557044 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9-tokenizer-cache\") pod \"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9\" (UID: \"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9\") " Apr 22 19:09:50.557108 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:50.557091 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9-tokenizer-uds\") pod \"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9\" (UID: \"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9\") " Apr 22 19:09:50.557283 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:50.557117 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9-tls-certs\") pod \"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9\" (UID: \"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9\") " Apr 22 
19:09:50.557283 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:50.557135 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgnmt\" (UniqueName: \"kubernetes.io/projected/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9-kube-api-access-zgnmt\") pod \"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9\" (UID: \"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9\") " Apr 22 19:09:50.557432 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:50.557409 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9" (UID: "cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:09:50.557488 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:50.557423 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9" (UID: "cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:09:50.557488 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:50.557425 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9" (UID: "cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:09:50.558081 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:50.558009 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9" (UID: "cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:09:50.559805 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:50.559768 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9" (UID: "cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:09:50.561575 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:50.561552 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9-kube-api-access-zgnmt" (OuterVolumeSpecName: "kube-api-access-zgnmt") pod "cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9" (UID: "cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9"). InnerVolumeSpecName "kube-api-access-zgnmt". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:09:50.658245 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:50.658199 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9-kserve-provision-location\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:09:50.658245 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:50.658238 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9-tokenizer-tmp\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:09:50.658245 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:50.658251 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9-tokenizer-cache\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:09:50.658497 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:50.658262 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9-tokenizer-uds\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:09:50.658497 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:50.658274 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9-tls-certs\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:09:50.658497 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:50.658289 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zgnmt\" (UniqueName: \"kubernetes.io/projected/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9-kube-api-access-zgnmt\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:09:51.025391 ip-10-0-133-84 kubenswrapper[2573]: 
I0422 19:09:51.025307 2573 generic.go:358] "Generic (PLEG): container finished" podID="cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9" containerID="43aef93c5c3ef238b0e2017e362b68d0111a114dfe4ccc2c6113d32f09d11588" exitCode=0 Apr 22 19:09:51.025559 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:51.025399 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" Apr 22 19:09:51.025559 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:51.025398 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" event={"ID":"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9","Type":"ContainerDied","Data":"43aef93c5c3ef238b0e2017e362b68d0111a114dfe4ccc2c6113d32f09d11588"} Apr 22 19:09:51.025559 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:51.025442 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm" event={"ID":"cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9","Type":"ContainerDied","Data":"e4af67cee9e0c4ebe42c86a57246c09e06895091762156cfab44d2e60e173a3f"} Apr 22 19:09:51.025559 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:51.025462 2573 scope.go:117] "RemoveContainer" containerID="43aef93c5c3ef238b0e2017e362b68d0111a114dfe4ccc2c6113d32f09d11588" Apr 22 19:09:51.034536 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:51.034518 2573 scope.go:117] "RemoveContainer" containerID="29cf9e09df93fcb10fba4197418a6a37c86de116d4a42da13a972c99f7c6cd31" Apr 22 19:09:51.041591 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:51.041573 2573 scope.go:117] "RemoveContainer" containerID="31b9cab9bd5ca1e91e21059b3f409ce6ec0894f9fbe1a2d0b3a564eb04cf19be" Apr 22 19:09:51.049140 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:51.049122 2573 scope.go:117] "RemoveContainer" 
containerID="43aef93c5c3ef238b0e2017e362b68d0111a114dfe4ccc2c6113d32f09d11588" Apr 22 19:09:51.049456 ip-10-0-133-84 kubenswrapper[2573]: E0422 19:09:51.049433 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43aef93c5c3ef238b0e2017e362b68d0111a114dfe4ccc2c6113d32f09d11588\": container with ID starting with 43aef93c5c3ef238b0e2017e362b68d0111a114dfe4ccc2c6113d32f09d11588 not found: ID does not exist" containerID="43aef93c5c3ef238b0e2017e362b68d0111a114dfe4ccc2c6113d32f09d11588" Apr 22 19:09:51.049537 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:51.049469 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43aef93c5c3ef238b0e2017e362b68d0111a114dfe4ccc2c6113d32f09d11588"} err="failed to get container status \"43aef93c5c3ef238b0e2017e362b68d0111a114dfe4ccc2c6113d32f09d11588\": rpc error: code = NotFound desc = could not find container \"43aef93c5c3ef238b0e2017e362b68d0111a114dfe4ccc2c6113d32f09d11588\": container with ID starting with 43aef93c5c3ef238b0e2017e362b68d0111a114dfe4ccc2c6113d32f09d11588 not found: ID does not exist" Apr 22 19:09:51.049537 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:51.049489 2573 scope.go:117] "RemoveContainer" containerID="29cf9e09df93fcb10fba4197418a6a37c86de116d4a42da13a972c99f7c6cd31" Apr 22 19:09:51.049656 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:51.049534 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm"] Apr 22 19:09:51.049764 ip-10-0-133-84 kubenswrapper[2573]: E0422 19:09:51.049744 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29cf9e09df93fcb10fba4197418a6a37c86de116d4a42da13a972c99f7c6cd31\": container with ID starting with 29cf9e09df93fcb10fba4197418a6a37c86de116d4a42da13a972c99f7c6cd31 not found: ID does not exist" 
containerID="29cf9e09df93fcb10fba4197418a6a37c86de116d4a42da13a972c99f7c6cd31" Apr 22 19:09:51.049832 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:51.049783 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29cf9e09df93fcb10fba4197418a6a37c86de116d4a42da13a972c99f7c6cd31"} err="failed to get container status \"29cf9e09df93fcb10fba4197418a6a37c86de116d4a42da13a972c99f7c6cd31\": rpc error: code = NotFound desc = could not find container \"29cf9e09df93fcb10fba4197418a6a37c86de116d4a42da13a972c99f7c6cd31\": container with ID starting with 29cf9e09df93fcb10fba4197418a6a37c86de116d4a42da13a972c99f7c6cd31 not found: ID does not exist" Apr 22 19:09:51.049832 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:51.049799 2573 scope.go:117] "RemoveContainer" containerID="31b9cab9bd5ca1e91e21059b3f409ce6ec0894f9fbe1a2d0b3a564eb04cf19be" Apr 22 19:09:51.050063 ip-10-0-133-84 kubenswrapper[2573]: E0422 19:09:51.050040 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31b9cab9bd5ca1e91e21059b3f409ce6ec0894f9fbe1a2d0b3a564eb04cf19be\": container with ID starting with 31b9cab9bd5ca1e91e21059b3f409ce6ec0894f9fbe1a2d0b3a564eb04cf19be not found: ID does not exist" containerID="31b9cab9bd5ca1e91e21059b3f409ce6ec0894f9fbe1a2d0b3a564eb04cf19be" Apr 22 19:09:51.050118 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:51.050080 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31b9cab9bd5ca1e91e21059b3f409ce6ec0894f9fbe1a2d0b3a564eb04cf19be"} err="failed to get container status \"31b9cab9bd5ca1e91e21059b3f409ce6ec0894f9fbe1a2d0b3a564eb04cf19be\": rpc error: code = NotFound desc = could not find container \"31b9cab9bd5ca1e91e21059b3f409ce6ec0894f9fbe1a2d0b3a564eb04cf19be\": container with ID starting with 31b9cab9bd5ca1e91e21059b3f409ce6ec0894f9fbe1a2d0b3a564eb04cf19be not found: ID does not exist" Apr 22 
19:09:51.052774 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:51.052752 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-857774975vdnwm"] Apr 22 19:09:51.366520 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:09:51.366425 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9" path="/var/lib/kubelet/pods/cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9/volumes" Apr 22 19:10:25.236625 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:25.236594 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8"] Apr 22 19:10:25.237079 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:25.236953 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" podUID="f7ae567c-2446-4dd7-85c8-92fedd7d1be2" containerName="main" containerID="cri-o://a33500a1dd533aa6da1991b7da25d3a70637710c975241da4078106427326041" gracePeriod=30 Apr 22 19:10:25.237079 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:25.236999 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" podUID="f7ae567c-2446-4dd7-85c8-92fedd7d1be2" containerName="tokenizer" containerID="cri-o://76018c635b9e8ec2bae48bc4037257b55f8db4a1f93d3840a97215e6ca9f0bc4" gracePeriod=30 Apr 22 19:10:25.680385 ip-10-0-133-84 kubenswrapper[2573]: W0422 19:10:25.680358 2573 logging.go:55] [core] [Channel #414 SubChannel #415]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.33:9003", ServerName: "10.133.0.33:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.33:9003: connect: connection refused" Apr 22 19:10:26.141157 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:26.141120 2573 generic.go:358] "Generic (PLEG): container finished" podID="f7ae567c-2446-4dd7-85c8-92fedd7d1be2" containerID="a33500a1dd533aa6da1991b7da25d3a70637710c975241da4078106427326041" exitCode=0 Apr 22 19:10:26.141354 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:26.141194 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" event={"ID":"f7ae567c-2446-4dd7-85c8-92fedd7d1be2","Type":"ContainerDied","Data":"a33500a1dd533aa6da1991b7da25d3a70637710c975241da4078106427326041"} Apr 22 19:10:26.480912 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:26.480887 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" Apr 22 19:10:26.554962 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:26.554931 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f7ae567c-2446-4dd7-85c8-92fedd7d1be2-tokenizer-cache\") pod \"f7ae567c-2446-4dd7-85c8-92fedd7d1be2\" (UID: \"f7ae567c-2446-4dd7-85c8-92fedd7d1be2\") " Apr 22 19:10:26.555147 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:26.554968 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxvl7\" (UniqueName: \"kubernetes.io/projected/f7ae567c-2446-4dd7-85c8-92fedd7d1be2-kube-api-access-lxvl7\") pod \"f7ae567c-2446-4dd7-85c8-92fedd7d1be2\" (UID: \"f7ae567c-2446-4dd7-85c8-92fedd7d1be2\") " Apr 22 19:10:26.555147 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:26.554984 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/f7ae567c-2446-4dd7-85c8-92fedd7d1be2-tokenizer-tmp\") pod \"f7ae567c-2446-4dd7-85c8-92fedd7d1be2\" (UID: \"f7ae567c-2446-4dd7-85c8-92fedd7d1be2\") " Apr 22 19:10:26.555147 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:26.555034 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f7ae567c-2446-4dd7-85c8-92fedd7d1be2-tokenizer-uds\") pod \"f7ae567c-2446-4dd7-85c8-92fedd7d1be2\" (UID: \"f7ae567c-2446-4dd7-85c8-92fedd7d1be2\") " Apr 22 19:10:26.555412 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:26.555227 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f7ae567c-2446-4dd7-85c8-92fedd7d1be2-tls-certs\") pod \"f7ae567c-2446-4dd7-85c8-92fedd7d1be2\" (UID: \"f7ae567c-2446-4dd7-85c8-92fedd7d1be2\") " Apr 22 19:10:26.555412 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:26.555284 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7ae567c-2446-4dd7-85c8-92fedd7d1be2-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "f7ae567c-2446-4dd7-85c8-92fedd7d1be2" (UID: "f7ae567c-2446-4dd7-85c8-92fedd7d1be2"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:10:26.555412 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:26.555298 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7ae567c-2446-4dd7-85c8-92fedd7d1be2-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "f7ae567c-2446-4dd7-85c8-92fedd7d1be2" (UID: "f7ae567c-2446-4dd7-85c8-92fedd7d1be2"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:10:26.555412 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:26.555306 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f7ae567c-2446-4dd7-85c8-92fedd7d1be2-kserve-provision-location\") pod \"f7ae567c-2446-4dd7-85c8-92fedd7d1be2\" (UID: \"f7ae567c-2446-4dd7-85c8-92fedd7d1be2\") " Apr 22 19:10:26.555621 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:26.555438 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7ae567c-2446-4dd7-85c8-92fedd7d1be2-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "f7ae567c-2446-4dd7-85c8-92fedd7d1be2" (UID: "f7ae567c-2446-4dd7-85c8-92fedd7d1be2"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:10:26.555621 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:26.555608 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f7ae567c-2446-4dd7-85c8-92fedd7d1be2-tokenizer-cache\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:10:26.555621 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:26.555622 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f7ae567c-2446-4dd7-85c8-92fedd7d1be2-tokenizer-tmp\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:10:26.555773 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:26.555631 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f7ae567c-2446-4dd7-85c8-92fedd7d1be2-tokenizer-uds\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:10:26.556026 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:26.555998 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/f7ae567c-2446-4dd7-85c8-92fedd7d1be2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f7ae567c-2446-4dd7-85c8-92fedd7d1be2" (UID: "f7ae567c-2446-4dd7-85c8-92fedd7d1be2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:10:26.557416 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:26.557396 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7ae567c-2446-4dd7-85c8-92fedd7d1be2-kube-api-access-lxvl7" (OuterVolumeSpecName: "kube-api-access-lxvl7") pod "f7ae567c-2446-4dd7-85c8-92fedd7d1be2" (UID: "f7ae567c-2446-4dd7-85c8-92fedd7d1be2"). InnerVolumeSpecName "kube-api-access-lxvl7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:10:26.557483 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:26.557432 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7ae567c-2446-4dd7-85c8-92fedd7d1be2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f7ae567c-2446-4dd7-85c8-92fedd7d1be2" (UID: "f7ae567c-2446-4dd7-85c8-92fedd7d1be2"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:10:26.657035 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:26.656958 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lxvl7\" (UniqueName: \"kubernetes.io/projected/f7ae567c-2446-4dd7-85c8-92fedd7d1be2-kube-api-access-lxvl7\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:10:26.657035 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:26.656986 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f7ae567c-2446-4dd7-85c8-92fedd7d1be2-tls-certs\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:10:26.657035 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:26.656998 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f7ae567c-2446-4dd7-85c8-92fedd7d1be2-kserve-provision-location\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:10:26.680989 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:26.680954 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" podUID="f7ae567c-2446-4dd7-85c8-92fedd7d1be2" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.33:9003\" within 1s: context deadline exceeded" Apr 22 19:10:27.146182 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:27.146118 2573 generic.go:358] "Generic (PLEG): container finished" podID="f7ae567c-2446-4dd7-85c8-92fedd7d1be2" containerID="76018c635b9e8ec2bae48bc4037257b55f8db4a1f93d3840a97215e6ca9f0bc4" exitCode=0 Apr 22 19:10:27.146351 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:27.146197 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" 
event={"ID":"f7ae567c-2446-4dd7-85c8-92fedd7d1be2","Type":"ContainerDied","Data":"76018c635b9e8ec2bae48bc4037257b55f8db4a1f93d3840a97215e6ca9f0bc4"} Apr 22 19:10:27.146351 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:27.146220 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" Apr 22 19:10:27.146351 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:27.146237 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8" event={"ID":"f7ae567c-2446-4dd7-85c8-92fedd7d1be2","Type":"ContainerDied","Data":"ea8bddf48bfac8035cbcd44133305367a642bb62f4f2edd582953a0ba287615a"} Apr 22 19:10:27.146351 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:27.146254 2573 scope.go:117] "RemoveContainer" containerID="76018c635b9e8ec2bae48bc4037257b55f8db4a1f93d3840a97215e6ca9f0bc4" Apr 22 19:10:27.154800 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:27.154782 2573 scope.go:117] "RemoveContainer" containerID="a33500a1dd533aa6da1991b7da25d3a70637710c975241da4078106427326041" Apr 22 19:10:27.161872 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:27.161854 2573 scope.go:117] "RemoveContainer" containerID="5a86b7aa861713d69fa77d83a193ce5a0a19bad5dcb6fb2959d600452fdc35c1" Apr 22 19:10:27.168257 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:27.168236 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8"] Apr 22 19:10:27.168979 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:27.168963 2573 scope.go:117] "RemoveContainer" containerID="76018c635b9e8ec2bae48bc4037257b55f8db4a1f93d3840a97215e6ca9f0bc4" Apr 22 19:10:27.169237 ip-10-0-133-84 kubenswrapper[2573]: E0422 19:10:27.169218 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"76018c635b9e8ec2bae48bc4037257b55f8db4a1f93d3840a97215e6ca9f0bc4\": container with ID starting with 76018c635b9e8ec2bae48bc4037257b55f8db4a1f93d3840a97215e6ca9f0bc4 not found: ID does not exist" containerID="76018c635b9e8ec2bae48bc4037257b55f8db4a1f93d3840a97215e6ca9f0bc4" Apr 22 19:10:27.169307 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:27.169244 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76018c635b9e8ec2bae48bc4037257b55f8db4a1f93d3840a97215e6ca9f0bc4"} err="failed to get container status \"76018c635b9e8ec2bae48bc4037257b55f8db4a1f93d3840a97215e6ca9f0bc4\": rpc error: code = NotFound desc = could not find container \"76018c635b9e8ec2bae48bc4037257b55f8db4a1f93d3840a97215e6ca9f0bc4\": container with ID starting with 76018c635b9e8ec2bae48bc4037257b55f8db4a1f93d3840a97215e6ca9f0bc4 not found: ID does not exist" Apr 22 19:10:27.169307 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:27.169261 2573 scope.go:117] "RemoveContainer" containerID="a33500a1dd533aa6da1991b7da25d3a70637710c975241da4078106427326041" Apr 22 19:10:27.169497 ip-10-0-133-84 kubenswrapper[2573]: E0422 19:10:27.169481 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a33500a1dd533aa6da1991b7da25d3a70637710c975241da4078106427326041\": container with ID starting with a33500a1dd533aa6da1991b7da25d3a70637710c975241da4078106427326041 not found: ID does not exist" containerID="a33500a1dd533aa6da1991b7da25d3a70637710c975241da4078106427326041" Apr 22 19:10:27.169540 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:27.169502 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a33500a1dd533aa6da1991b7da25d3a70637710c975241da4078106427326041"} err="failed to get container status \"a33500a1dd533aa6da1991b7da25d3a70637710c975241da4078106427326041\": rpc error: code = NotFound desc = could not find container 
\"a33500a1dd533aa6da1991b7da25d3a70637710c975241da4078106427326041\": container with ID starting with a33500a1dd533aa6da1991b7da25d3a70637710c975241da4078106427326041 not found: ID does not exist" Apr 22 19:10:27.169540 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:27.169519 2573 scope.go:117] "RemoveContainer" containerID="5a86b7aa861713d69fa77d83a193ce5a0a19bad5dcb6fb2959d600452fdc35c1" Apr 22 19:10:27.169725 ip-10-0-133-84 kubenswrapper[2573]: E0422 19:10:27.169707 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a86b7aa861713d69fa77d83a193ce5a0a19bad5dcb6fb2959d600452fdc35c1\": container with ID starting with 5a86b7aa861713d69fa77d83a193ce5a0a19bad5dcb6fb2959d600452fdc35c1 not found: ID does not exist" containerID="5a86b7aa861713d69fa77d83a193ce5a0a19bad5dcb6fb2959d600452fdc35c1" Apr 22 19:10:27.169779 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:27.169735 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a86b7aa861713d69fa77d83a193ce5a0a19bad5dcb6fb2959d600452fdc35c1"} err="failed to get container status \"5a86b7aa861713d69fa77d83a193ce5a0a19bad5dcb6fb2959d600452fdc35c1\": rpc error: code = NotFound desc = could not find container \"5a86b7aa861713d69fa77d83a193ce5a0a19bad5dcb6fb2959d600452fdc35c1\": container with ID starting with 5a86b7aa861713d69fa77d83a193ce5a0a19bad5dcb6fb2959d600452fdc35c1 not found: ID does not exist" Apr 22 19:10:27.173663 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:27.173642 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche7j7b8"] Apr 22 19:10:27.363640 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:27.363605 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7ae567c-2446-4dd7-85c8-92fedd7d1be2" path="/var/lib/kubelet/pods/f7ae567c-2446-4dd7-85c8-92fedd7d1be2/volumes" Apr 22 
19:10:29.637839 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.637796 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm"] Apr 22 19:10:29.638214 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.638127 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9" containerName="tokenizer" Apr 22 19:10:29.638214 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.638139 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9" containerName="tokenizer" Apr 22 19:10:29.638214 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.638157 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9" containerName="storage-initializer" Apr 22 19:10:29.638214 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.638190 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9" containerName="storage-initializer" Apr 22 19:10:29.638214 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.638201 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f7ae567c-2446-4dd7-85c8-92fedd7d1be2" containerName="storage-initializer" Apr 22 19:10:29.638214 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.638207 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7ae567c-2446-4dd7-85c8-92fedd7d1be2" containerName="storage-initializer" Apr 22 19:10:29.638214 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.638212 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f7ae567c-2446-4dd7-85c8-92fedd7d1be2" containerName="tokenizer" Apr 22 19:10:29.638214 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.638217 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7ae567c-2446-4dd7-85c8-92fedd7d1be2" containerName="tokenizer" Apr 
22 19:10:29.638494 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.638227 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9" containerName="main" Apr 22 19:10:29.638494 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.638232 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9" containerName="main" Apr 22 19:10:29.638494 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.638242 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f7ae567c-2446-4dd7-85c8-92fedd7d1be2" containerName="main" Apr 22 19:10:29.638494 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.638248 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7ae567c-2446-4dd7-85c8-92fedd7d1be2" containerName="main" Apr 22 19:10:29.638494 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.638304 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f7ae567c-2446-4dd7-85c8-92fedd7d1be2" containerName="main" Apr 22 19:10:29.638494 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.638314 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9" containerName="tokenizer" Apr 22 19:10:29.638494 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.638321 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f7ae567c-2446-4dd7-85c8-92fedd7d1be2" containerName="tokenizer" Apr 22 19:10:29.638494 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.638328 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="cf026162-7b9e-44d3-a32f-2a9fd9ecd4e9" containerName="main" Apr 22 19:10:29.643518 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.643498 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm"
Apr 22 19:10:29.646123 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.646100 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 22 19:10:29.646268 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.646185 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4mhhb\""
Apr 22 19:10:29.647131 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.647112 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 22 19:10:29.647309 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.647217 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-epp-sa-dockercfg-kpjlm\""
Apr 22 19:10:29.647367 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.647336 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\""
Apr 22 19:10:29.651079 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.651052 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm"]
Apr 22 19:10:29.783970 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.783935 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d81db7c5-136c-426d-95e0-80ccd5b0f807-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm\" (UID: \"d81db7c5-136c-426d-95e0-80ccd5b0f807\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm"
Apr 22 19:10:29.783970 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.783973 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d81db7c5-136c-426d-95e0-80ccd5b0f807-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm\" (UID: \"d81db7c5-136c-426d-95e0-80ccd5b0f807\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm"
Apr 22 19:10:29.784231 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.783992 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d81db7c5-136c-426d-95e0-80ccd5b0f807-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm\" (UID: \"d81db7c5-136c-426d-95e0-80ccd5b0f807\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm"
Apr 22 19:10:29.784231 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.784016 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d81db7c5-136c-426d-95e0-80ccd5b0f807-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm\" (UID: \"d81db7c5-136c-426d-95e0-80ccd5b0f807\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm"
Apr 22 19:10:29.784231 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.784065 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hstx8\" (UniqueName: \"kubernetes.io/projected/d81db7c5-136c-426d-95e0-80ccd5b0f807-kube-api-access-hstx8\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm\" (UID: \"d81db7c5-136c-426d-95e0-80ccd5b0f807\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm"
Apr 22 19:10:29.784231 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.784122 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d81db7c5-136c-426d-95e0-80ccd5b0f807-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm\" (UID: \"d81db7c5-136c-426d-95e0-80ccd5b0f807\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm"
Apr 22 19:10:29.885333 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.885303 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d81db7c5-136c-426d-95e0-80ccd5b0f807-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm\" (UID: \"d81db7c5-136c-426d-95e0-80ccd5b0f807\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm"
Apr 22 19:10:29.885469 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.885340 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d81db7c5-136c-426d-95e0-80ccd5b0f807-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm\" (UID: \"d81db7c5-136c-426d-95e0-80ccd5b0f807\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm"
Apr 22 19:10:29.885469 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.885370 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d81db7c5-136c-426d-95e0-80ccd5b0f807-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm\" (UID: \"d81db7c5-136c-426d-95e0-80ccd5b0f807\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm"
Apr 22 19:10:29.885469 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.885403 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d81db7c5-136c-426d-95e0-80ccd5b0f807-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm\" (UID: \"d81db7c5-136c-426d-95e0-80ccd5b0f807\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm"
Apr 22 19:10:29.885469 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.885432 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hstx8\" (UniqueName: \"kubernetes.io/projected/d81db7c5-136c-426d-95e0-80ccd5b0f807-kube-api-access-hstx8\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm\" (UID: \"d81db7c5-136c-426d-95e0-80ccd5b0f807\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm"
Apr 22 19:10:29.885685 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.885479 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d81db7c5-136c-426d-95e0-80ccd5b0f807-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm\" (UID: \"d81db7c5-136c-426d-95e0-80ccd5b0f807\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm"
Apr 22 19:10:29.885790 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.885768 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d81db7c5-136c-426d-95e0-80ccd5b0f807-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm\" (UID: \"d81db7c5-136c-426d-95e0-80ccd5b0f807\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm"
Apr 22 19:10:29.885872 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.885790 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d81db7c5-136c-426d-95e0-80ccd5b0f807-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm\" (UID: \"d81db7c5-136c-426d-95e0-80ccd5b0f807\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm"
Apr 22 19:10:29.885872 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.885840 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d81db7c5-136c-426d-95e0-80ccd5b0f807-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm\" (UID: \"d81db7c5-136c-426d-95e0-80ccd5b0f807\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm"
Apr 22 19:10:29.885872 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.885848 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d81db7c5-136c-426d-95e0-80ccd5b0f807-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm\" (UID: \"d81db7c5-136c-426d-95e0-80ccd5b0f807\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm"
Apr 22 19:10:29.888017 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.887974 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d81db7c5-136c-426d-95e0-80ccd5b0f807-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm\" (UID: \"d81db7c5-136c-426d-95e0-80ccd5b0f807\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm"
Apr 22 19:10:29.892483 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.892462 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hstx8\" (UniqueName: \"kubernetes.io/projected/d81db7c5-136c-426d-95e0-80ccd5b0f807-kube-api-access-hstx8\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm\" (UID: \"d81db7c5-136c-426d-95e0-80ccd5b0f807\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm"
Apr 22 19:10:29.954492 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:29.954466 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm"
Apr 22 19:10:30.079143 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:30.079114 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm"]
Apr 22 19:10:30.083359 ip-10-0-133-84 kubenswrapper[2573]: W0422 19:10:30.083326 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd81db7c5_136c_426d_95e0_80ccd5b0f807.slice/crio-e6a57231ea1b6f814456b2c6f3dae29f5e97837b0fad831c4fa5136e08ac6143 WatchSource:0}: Error finding container e6a57231ea1b6f814456b2c6f3dae29f5e97837b0fad831c4fa5136e08ac6143: Status 404 returned error can't find the container with id e6a57231ea1b6f814456b2c6f3dae29f5e97837b0fad831c4fa5136e08ac6143
Apr 22 19:10:30.085656 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:30.085637 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:10:30.158588 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:30.158557 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm" event={"ID":"d81db7c5-136c-426d-95e0-80ccd5b0f807","Type":"ContainerStarted","Data":"34d3baffe18c5c06bfc2bb9eafed9a786f36070f248cb09e53f749fd9e6e456a"}
Apr 22 19:10:30.158725 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:30.158596 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm" event={"ID":"d81db7c5-136c-426d-95e0-80ccd5b0f807","Type":"ContainerStarted","Data":"e6a57231ea1b6f814456b2c6f3dae29f5e97837b0fad831c4fa5136e08ac6143"}
Apr 22 19:10:31.164110 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:31.164075 2573 generic.go:358] "Generic (PLEG): container finished" podID="d81db7c5-136c-426d-95e0-80ccd5b0f807" containerID="34d3baffe18c5c06bfc2bb9eafed9a786f36070f248cb09e53f749fd9e6e456a" exitCode=0
Apr 22 19:10:31.164618 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:31.164142 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm" event={"ID":"d81db7c5-136c-426d-95e0-80ccd5b0f807","Type":"ContainerDied","Data":"34d3baffe18c5c06bfc2bb9eafed9a786f36070f248cb09e53f749fd9e6e456a"}
Apr 22 19:10:32.171043 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:32.171002 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm" event={"ID":"d81db7c5-136c-426d-95e0-80ccd5b0f807","Type":"ContainerStarted","Data":"d3e60151bceb858f0b59925d2abb14592ee863f9361364de705758b7e276ca9b"}
Apr 22 19:10:32.171043 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:32.171049 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm" event={"ID":"d81db7c5-136c-426d-95e0-80ccd5b0f807","Type":"ContainerStarted","Data":"d625237eb9c939ed36cc9a26ef253a230165dcbd3a1e1ffd0116bba3fa8f8d66"}
Apr 22 19:10:32.171470 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:32.171154 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm"
Apr 22 19:10:32.193603 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:32.193550 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm" podStartSLOduration=3.193531974 podStartE2EDuration="3.193531974s" podCreationTimestamp="2026-04-22 19:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:10:32.190489101 +0000 UTC m=+1427.430605203" watchObservedRunningTime="2026-04-22 19:10:32.193531974 +0000 UTC m=+1427.433648078"
Apr 22 19:10:39.954934 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:39.954898 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm"
Apr 22 19:10:39.954934 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:39.954944 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm"
Apr 22 19:10:39.957467 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:39.957443 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm"
Apr 22 19:10:40.203974 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:10:40.203945 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm"
Apr 22 19:11:01.207486 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:11:01.207454 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm"
Apr 22 19:11:45.381192 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:11:45.381150 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cr6wp_bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278/ovn-acl-logging/0.log"
Apr 22 19:11:45.387133 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:11:45.387112 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cr6wp_bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278/ovn-acl-logging/0.log"
Apr 22 19:12:28.594880 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:28.594840 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr"]
Apr 22 19:12:28.598600 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:28.598578 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr"
Apr 22 19:12:28.601279 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:28.601252 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\""
Apr 22 19:12:28.601417 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:28.601318 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5ec-epp-sa-dockercfg-x9l5b\""
Apr 22 19:12:28.610177 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:28.610134 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr"]
Apr 22 19:12:28.727027 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:28.726986 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5086c326-3298-4b53-bdd5-b1352812e1d1-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr\" (UID: \"5086c326-3298-4b53-bdd5-b1352812e1d1\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr"
Apr 22 19:12:28.727027 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:28.727035 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5086c326-3298-4b53-bdd5-b1352812e1d1-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr\" (UID: \"5086c326-3298-4b53-bdd5-b1352812e1d1\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr"
Apr 22 19:12:28.727295 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:28.727083 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5086c326-3298-4b53-bdd5-b1352812e1d1-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr\" (UID: \"5086c326-3298-4b53-bdd5-b1352812e1d1\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr"
Apr 22 19:12:28.727295 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:28.727131 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5086c326-3298-4b53-bdd5-b1352812e1d1-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr\" (UID: \"5086c326-3298-4b53-bdd5-b1352812e1d1\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr"
Apr 22 19:12:28.727295 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:28.727148 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4llfl\" (UniqueName: \"kubernetes.io/projected/5086c326-3298-4b53-bdd5-b1352812e1d1-kube-api-access-4llfl\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr\" (UID: \"5086c326-3298-4b53-bdd5-b1352812e1d1\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr"
Apr 22 19:12:28.727295 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:28.727198 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5086c326-3298-4b53-bdd5-b1352812e1d1-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr\" (UID: \"5086c326-3298-4b53-bdd5-b1352812e1d1\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr"
Apr 22 19:12:28.827589 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:28.827550 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5086c326-3298-4b53-bdd5-b1352812e1d1-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr\" (UID: \"5086c326-3298-4b53-bdd5-b1352812e1d1\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr"
Apr 22 19:12:28.827768 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:28.827598 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5086c326-3298-4b53-bdd5-b1352812e1d1-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr\" (UID: \"5086c326-3298-4b53-bdd5-b1352812e1d1\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr"
Apr 22 19:12:28.827768 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:28.827628 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5086c326-3298-4b53-bdd5-b1352812e1d1-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr\" (UID: \"5086c326-3298-4b53-bdd5-b1352812e1d1\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr"
Apr 22 19:12:28.827768 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:28.827661 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5086c326-3298-4b53-bdd5-b1352812e1d1-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr\" (UID: \"5086c326-3298-4b53-bdd5-b1352812e1d1\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr"
Apr 22 19:12:28.827967 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:28.827889 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5086c326-3298-4b53-bdd5-b1352812e1d1-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr\" (UID: \"5086c326-3298-4b53-bdd5-b1352812e1d1\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr"
Apr 22 19:12:28.827967 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:28.827931 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4llfl\" (UniqueName: \"kubernetes.io/projected/5086c326-3298-4b53-bdd5-b1352812e1d1-kube-api-access-4llfl\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr\" (UID: \"5086c326-3298-4b53-bdd5-b1352812e1d1\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr"
Apr 22 19:12:28.828080 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:28.828001 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5086c326-3298-4b53-bdd5-b1352812e1d1-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr\" (UID: \"5086c326-3298-4b53-bdd5-b1352812e1d1\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr"
Apr 22 19:12:28.828080 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:28.828030 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5086c326-3298-4b53-bdd5-b1352812e1d1-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr\" (UID: \"5086c326-3298-4b53-bdd5-b1352812e1d1\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr"
Apr 22 19:12:28.828194 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:28.828073 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5086c326-3298-4b53-bdd5-b1352812e1d1-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr\" (UID: \"5086c326-3298-4b53-bdd5-b1352812e1d1\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr"
Apr 22 19:12:28.828264 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:28.828245 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5086c326-3298-4b53-bdd5-b1352812e1d1-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr\" (UID: \"5086c326-3298-4b53-bdd5-b1352812e1d1\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr"
Apr 22 19:12:28.831090 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:28.831067 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5086c326-3298-4b53-bdd5-b1352812e1d1-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr\" (UID: \"5086c326-3298-4b53-bdd5-b1352812e1d1\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr"
Apr 22 19:12:28.835761 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:28.835735 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4llfl\" (UniqueName: \"kubernetes.io/projected/5086c326-3298-4b53-bdd5-b1352812e1d1-kube-api-access-4llfl\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr\" (UID: \"5086c326-3298-4b53-bdd5-b1352812e1d1\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr"
Apr 22 19:12:28.911515 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:28.911484 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr"
Apr 22 19:12:29.038794 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:29.038733 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr"]
Apr 22 19:12:29.041430 ip-10-0-133-84 kubenswrapper[2573]: W0422 19:12:29.041395 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5086c326_3298_4b53_bdd5_b1352812e1d1.slice/crio-cd04f26ab79934008270c0c571c8801713fb030b442f0664ee4bda7ecfc5ee1b WatchSource:0}: Error finding container cd04f26ab79934008270c0c571c8801713fb030b442f0664ee4bda7ecfc5ee1b: Status 404 returned error can't find the container with id cd04f26ab79934008270c0c571c8801713fb030b442f0664ee4bda7ecfc5ee1b
Apr 22 19:12:29.570639 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:29.570605 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr" event={"ID":"5086c326-3298-4b53-bdd5-b1352812e1d1","Type":"ContainerStarted","Data":"1e41adb449f98b03ae1a471cffec1d51f6f31cb88ca5f79675a54b6892cd993f"}
Apr 22 19:12:29.570639 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:29.570641 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr" event={"ID":"5086c326-3298-4b53-bdd5-b1352812e1d1","Type":"ContainerStarted","Data":"cd04f26ab79934008270c0c571c8801713fb030b442f0664ee4bda7ecfc5ee1b"}
Apr 22 19:12:30.575434 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:30.575397 2573 generic.go:358] "Generic (PLEG): container finished" podID="5086c326-3298-4b53-bdd5-b1352812e1d1" containerID="1e41adb449f98b03ae1a471cffec1d51f6f31cb88ca5f79675a54b6892cd993f" exitCode=0
Apr 22 19:12:30.575876 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:30.575494 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr" event={"ID":"5086c326-3298-4b53-bdd5-b1352812e1d1","Type":"ContainerDied","Data":"1e41adb449f98b03ae1a471cffec1d51f6f31cb88ca5f79675a54b6892cd993f"}
Apr 22 19:12:31.581420 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:31.581313 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr" event={"ID":"5086c326-3298-4b53-bdd5-b1352812e1d1","Type":"ContainerStarted","Data":"d974de68b93390e266b0ab1d5527380177940f65285602d65d8af1fb47ec8b05"}
Apr 22 19:12:31.581420 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:31.581350 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr" event={"ID":"5086c326-3298-4b53-bdd5-b1352812e1d1","Type":"ContainerStarted","Data":"73237e8d51648d8b13e2c67e52f02fa1b3e799570e9406521526f79544d19d17"}
Apr 22 19:12:31.599021 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:31.581498 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr"
Apr 22 19:12:31.601061 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:31.601024 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr" podStartSLOduration=3.6010102 podStartE2EDuration="3.6010102s" podCreationTimestamp="2026-04-22 19:12:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:12:31.599848644 +0000 UTC m=+1546.839964810" watchObservedRunningTime="2026-04-22 19:12:31.6010102 +0000 UTC m=+1546.841126283"
Apr 22 19:12:38.912225 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:38.912187 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr"
Apr 22 19:12:38.912690 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:38.912241 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr"
Apr 22 19:12:38.914713 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:38.914687 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr"
Apr 22 19:12:39.609616 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:39.609582 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr"
Apr 22 19:12:53.211703 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:53.211674 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm"]
Apr 22 19:12:53.212208 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:53.211971 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm" podUID="d81db7c5-136c-426d-95e0-80ccd5b0f807" containerName="main" containerID="cri-o://d625237eb9c939ed36cc9a26ef253a230165dcbd3a1e1ffd0116bba3fa8f8d66" gracePeriod=30
Apr 22 19:12:53.212208 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:53.212009 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm" podUID="d81db7c5-136c-426d-95e0-80ccd5b0f807" containerName="tokenizer" containerID="cri-o://d3e60151bceb858f0b59925d2abb14592ee863f9361364de705758b7e276ca9b" gracePeriod=30
Apr 22 19:12:53.655858 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:53.655826 2573 generic.go:358] "Generic (PLEG): container finished" podID="d81db7c5-136c-426d-95e0-80ccd5b0f807" containerID="d625237eb9c939ed36cc9a26ef253a230165dcbd3a1e1ffd0116bba3fa8f8d66" exitCode=0
Apr 22 19:12:53.656032 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:53.655910 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm" event={"ID":"d81db7c5-136c-426d-95e0-80ccd5b0f807","Type":"ContainerDied","Data":"d625237eb9c939ed36cc9a26ef253a230165dcbd3a1e1ffd0116bba3fa8f8d66"}
Apr 22 19:12:54.459578 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:54.459557 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm"
Apr 22 19:12:54.542965 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:54.542894 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d81db7c5-136c-426d-95e0-80ccd5b0f807-tls-certs\") pod \"d81db7c5-136c-426d-95e0-80ccd5b0f807\" (UID: \"d81db7c5-136c-426d-95e0-80ccd5b0f807\") "
Apr 22 19:12:54.542965 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:54.542934 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d81db7c5-136c-426d-95e0-80ccd5b0f807-tokenizer-uds\") pod \"d81db7c5-136c-426d-95e0-80ccd5b0f807\" (UID: \"d81db7c5-136c-426d-95e0-80ccd5b0f807\") "
Apr 22 19:12:54.542965 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:54.542958 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d81db7c5-136c-426d-95e0-80ccd5b0f807-kserve-provision-location\") pod \"d81db7c5-136c-426d-95e0-80ccd5b0f807\" (UID: \"d81db7c5-136c-426d-95e0-80ccd5b0f807\") "
Apr 22 19:12:54.543272 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:54.542990 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hstx8\" (UniqueName: \"kubernetes.io/projected/d81db7c5-136c-426d-95e0-80ccd5b0f807-kube-api-access-hstx8\") pod \"d81db7c5-136c-426d-95e0-80ccd5b0f807\" (UID: \"d81db7c5-136c-426d-95e0-80ccd5b0f807\") "
Apr 22 19:12:54.543272 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:54.543021 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d81db7c5-136c-426d-95e0-80ccd5b0f807-tokenizer-cache\") pod \"d81db7c5-136c-426d-95e0-80ccd5b0f807\" (UID: \"d81db7c5-136c-426d-95e0-80ccd5b0f807\") "
Apr 22 19:12:54.543272 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:54.543045 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d81db7c5-136c-426d-95e0-80ccd5b0f807-tokenizer-tmp\") pod \"d81db7c5-136c-426d-95e0-80ccd5b0f807\" (UID: \"d81db7c5-136c-426d-95e0-80ccd5b0f807\") "
Apr 22 19:12:54.543272 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:54.543238 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d81db7c5-136c-426d-95e0-80ccd5b0f807-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "d81db7c5-136c-426d-95e0-80ccd5b0f807" (UID: "d81db7c5-136c-426d-95e0-80ccd5b0f807"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:12:54.543479 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:54.543336 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d81db7c5-136c-426d-95e0-80ccd5b0f807-tokenizer-uds\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\""
Apr 22 19:12:54.543479 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:54.543338 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d81db7c5-136c-426d-95e0-80ccd5b0f807-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "d81db7c5-136c-426d-95e0-80ccd5b0f807" (UID: "d81db7c5-136c-426d-95e0-80ccd5b0f807"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:12:54.543479 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:54.543458 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d81db7c5-136c-426d-95e0-80ccd5b0f807-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "d81db7c5-136c-426d-95e0-80ccd5b0f807" (UID: "d81db7c5-136c-426d-95e0-80ccd5b0f807"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:12:54.543831 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:54.543805 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d81db7c5-136c-426d-95e0-80ccd5b0f807-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d81db7c5-136c-426d-95e0-80ccd5b0f807" (UID: "d81db7c5-136c-426d-95e0-80ccd5b0f807"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:12:54.545214 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:54.545196 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d81db7c5-136c-426d-95e0-80ccd5b0f807-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d81db7c5-136c-426d-95e0-80ccd5b0f807" (UID: "d81db7c5-136c-426d-95e0-80ccd5b0f807"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:12:54.545281 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:54.545241 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d81db7c5-136c-426d-95e0-80ccd5b0f807-kube-api-access-hstx8" (OuterVolumeSpecName: "kube-api-access-hstx8") pod "d81db7c5-136c-426d-95e0-80ccd5b0f807" (UID: "d81db7c5-136c-426d-95e0-80ccd5b0f807"). InnerVolumeSpecName "kube-api-access-hstx8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:12:54.644445 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:54.644412 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d81db7c5-136c-426d-95e0-80ccd5b0f807-tls-certs\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\""
Apr 22 19:12:54.644445 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:54.644439 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d81db7c5-136c-426d-95e0-80ccd5b0f807-kserve-provision-location\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\""
Apr 22 19:12:54.644445 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:54.644450 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hstx8\" (UniqueName: \"kubernetes.io/projected/d81db7c5-136c-426d-95e0-80ccd5b0f807-kube-api-access-hstx8\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\""
Apr 22 19:12:54.644643 ip-10-0-133-84 kubenswrapper[2573]: I0422
19:12:54.644459 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d81db7c5-136c-426d-95e0-80ccd5b0f807-tokenizer-cache\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:12:54.644643 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:54.644469 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d81db7c5-136c-426d-95e0-80ccd5b0f807-tokenizer-tmp\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:12:54.661866 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:54.661835 2573 generic.go:358] "Generic (PLEG): container finished" podID="d81db7c5-136c-426d-95e0-80ccd5b0f807" containerID="d3e60151bceb858f0b59925d2abb14592ee863f9361364de705758b7e276ca9b" exitCode=0 Apr 22 19:12:54.662006 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:54.661879 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm" event={"ID":"d81db7c5-136c-426d-95e0-80ccd5b0f807","Type":"ContainerDied","Data":"d3e60151bceb858f0b59925d2abb14592ee863f9361364de705758b7e276ca9b"} Apr 22 19:12:54.662006 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:54.661906 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm" Apr 22 19:12:54.662006 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:54.661921 2573 scope.go:117] "RemoveContainer" containerID="d3e60151bceb858f0b59925d2abb14592ee863f9361364de705758b7e276ca9b" Apr 22 19:12:54.662146 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:54.661909 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm" event={"ID":"d81db7c5-136c-426d-95e0-80ccd5b0f807","Type":"ContainerDied","Data":"e6a57231ea1b6f814456b2c6f3dae29f5e97837b0fad831c4fa5136e08ac6143"} Apr 22 19:12:54.669997 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:54.669979 2573 scope.go:117] "RemoveContainer" containerID="d625237eb9c939ed36cc9a26ef253a230165dcbd3a1e1ffd0116bba3fa8f8d66" Apr 22 19:12:54.677081 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:54.677063 2573 scope.go:117] "RemoveContainer" containerID="34d3baffe18c5c06bfc2bb9eafed9a786f36070f248cb09e53f749fd9e6e456a" Apr 22 19:12:54.682672 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:54.682647 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm"] Apr 22 19:12:54.685292 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:54.685267 2573 scope.go:117] "RemoveContainer" containerID="d3e60151bceb858f0b59925d2abb14592ee863f9361364de705758b7e276ca9b" Apr 22 19:12:54.685583 ip-10-0-133-84 kubenswrapper[2573]: E0422 19:12:54.685559 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3e60151bceb858f0b59925d2abb14592ee863f9361364de705758b7e276ca9b\": container with ID starting with d3e60151bceb858f0b59925d2abb14592ee863f9361364de705758b7e276ca9b not found: ID does not exist" containerID="d3e60151bceb858f0b59925d2abb14592ee863f9361364de705758b7e276ca9b" Apr 22 
19:12:54.685638 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:54.685594 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3e60151bceb858f0b59925d2abb14592ee863f9361364de705758b7e276ca9b"} err="failed to get container status \"d3e60151bceb858f0b59925d2abb14592ee863f9361364de705758b7e276ca9b\": rpc error: code = NotFound desc = could not find container \"d3e60151bceb858f0b59925d2abb14592ee863f9361364de705758b7e276ca9b\": container with ID starting with d3e60151bceb858f0b59925d2abb14592ee863f9361364de705758b7e276ca9b not found: ID does not exist" Apr 22 19:12:54.685638 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:54.685621 2573 scope.go:117] "RemoveContainer" containerID="d625237eb9c939ed36cc9a26ef253a230165dcbd3a1e1ffd0116bba3fa8f8d66" Apr 22 19:12:54.685838 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:54.685814 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7cf5fzctgm"] Apr 22 19:12:54.685890 ip-10-0-133-84 kubenswrapper[2573]: E0422 19:12:54.685875 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d625237eb9c939ed36cc9a26ef253a230165dcbd3a1e1ffd0116bba3fa8f8d66\": container with ID starting with d625237eb9c939ed36cc9a26ef253a230165dcbd3a1e1ffd0116bba3fa8f8d66 not found: ID does not exist" containerID="d625237eb9c939ed36cc9a26ef253a230165dcbd3a1e1ffd0116bba3fa8f8d66" Apr 22 19:12:54.685926 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:54.685897 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d625237eb9c939ed36cc9a26ef253a230165dcbd3a1e1ffd0116bba3fa8f8d66"} err="failed to get container status \"d625237eb9c939ed36cc9a26ef253a230165dcbd3a1e1ffd0116bba3fa8f8d66\": rpc error: code = NotFound desc = could not find container \"d625237eb9c939ed36cc9a26ef253a230165dcbd3a1e1ffd0116bba3fa8f8d66\": 
container with ID starting with d625237eb9c939ed36cc9a26ef253a230165dcbd3a1e1ffd0116bba3fa8f8d66 not found: ID does not exist" Apr 22 19:12:54.685926 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:54.685912 2573 scope.go:117] "RemoveContainer" containerID="34d3baffe18c5c06bfc2bb9eafed9a786f36070f248cb09e53f749fd9e6e456a" Apr 22 19:12:54.686144 ip-10-0-133-84 kubenswrapper[2573]: E0422 19:12:54.686112 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34d3baffe18c5c06bfc2bb9eafed9a786f36070f248cb09e53f749fd9e6e456a\": container with ID starting with 34d3baffe18c5c06bfc2bb9eafed9a786f36070f248cb09e53f749fd9e6e456a not found: ID does not exist" containerID="34d3baffe18c5c06bfc2bb9eafed9a786f36070f248cb09e53f749fd9e6e456a" Apr 22 19:12:54.686212 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:54.686181 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34d3baffe18c5c06bfc2bb9eafed9a786f36070f248cb09e53f749fd9e6e456a"} err="failed to get container status \"34d3baffe18c5c06bfc2bb9eafed9a786f36070f248cb09e53f749fd9e6e456a\": rpc error: code = NotFound desc = could not find container \"34d3baffe18c5c06bfc2bb9eafed9a786f36070f248cb09e53f749fd9e6e456a\": container with ID starting with 34d3baffe18c5c06bfc2bb9eafed9a786f36070f248cb09e53f749fd9e6e456a not found: ID does not exist" Apr 22 19:12:55.363273 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:12:55.363243 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d81db7c5-136c-426d-95e0-80ccd5b0f807" path="/var/lib/kubelet/pods/d81db7c5-136c-426d-95e0-80ccd5b0f807/volumes" Apr 22 19:13:00.614553 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:13:00.614525 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr" Apr 22 19:14:26.296157 ip-10-0-133-84 kubenswrapper[2573]: I0422 
19:14:26.296122 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr"] Apr 22 19:14:26.296615 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:26.296562 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr" podUID="5086c326-3298-4b53-bdd5-b1352812e1d1" containerName="main" containerID="cri-o://73237e8d51648d8b13e2c67e52f02fa1b3e799570e9406521526f79544d19d17" gracePeriod=30 Apr 22 19:14:26.296691 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:26.296625 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr" podUID="5086c326-3298-4b53-bdd5-b1352812e1d1" containerName="tokenizer" containerID="cri-o://d974de68b93390e266b0ab1d5527380177940f65285602d65d8af1fb47ec8b05" gracePeriod=30 Apr 22 19:14:26.975860 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:26.975825 2573 generic.go:358] "Generic (PLEG): container finished" podID="5086c326-3298-4b53-bdd5-b1352812e1d1" containerID="73237e8d51648d8b13e2c67e52f02fa1b3e799570e9406521526f79544d19d17" exitCode=0 Apr 22 19:14:26.976031 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:26.975900 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr" event={"ID":"5086c326-3298-4b53-bdd5-b1352812e1d1","Type":"ContainerDied","Data":"73237e8d51648d8b13e2c67e52f02fa1b3e799570e9406521526f79544d19d17"} Apr 22 19:14:27.451800 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.451778 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr" Apr 22 19:14:27.481421 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.481395 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5086c326-3298-4b53-bdd5-b1352812e1d1-tokenizer-cache\") pod \"5086c326-3298-4b53-bdd5-b1352812e1d1\" (UID: \"5086c326-3298-4b53-bdd5-b1352812e1d1\") " Apr 22 19:14:27.481567 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.481464 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5086c326-3298-4b53-bdd5-b1352812e1d1-tokenizer-uds\") pod \"5086c326-3298-4b53-bdd5-b1352812e1d1\" (UID: \"5086c326-3298-4b53-bdd5-b1352812e1d1\") " Apr 22 19:14:27.481567 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.481488 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5086c326-3298-4b53-bdd5-b1352812e1d1-tokenizer-tmp\") pod \"5086c326-3298-4b53-bdd5-b1352812e1d1\" (UID: \"5086c326-3298-4b53-bdd5-b1352812e1d1\") " Apr 22 19:14:27.481567 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.481519 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5086c326-3298-4b53-bdd5-b1352812e1d1-kserve-provision-location\") pod \"5086c326-3298-4b53-bdd5-b1352812e1d1\" (UID: \"5086c326-3298-4b53-bdd5-b1352812e1d1\") " Apr 22 19:14:27.481567 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.481554 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4llfl\" (UniqueName: \"kubernetes.io/projected/5086c326-3298-4b53-bdd5-b1352812e1d1-kube-api-access-4llfl\") pod \"5086c326-3298-4b53-bdd5-b1352812e1d1\" (UID: 
\"5086c326-3298-4b53-bdd5-b1352812e1d1\") " Apr 22 19:14:27.481759 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.481600 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5086c326-3298-4b53-bdd5-b1352812e1d1-tls-certs\") pod \"5086c326-3298-4b53-bdd5-b1352812e1d1\" (UID: \"5086c326-3298-4b53-bdd5-b1352812e1d1\") " Apr 22 19:14:27.481759 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.481723 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5086c326-3298-4b53-bdd5-b1352812e1d1-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "5086c326-3298-4b53-bdd5-b1352812e1d1" (UID: "5086c326-3298-4b53-bdd5-b1352812e1d1"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:14:27.481871 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.481755 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5086c326-3298-4b53-bdd5-b1352812e1d1-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "5086c326-3298-4b53-bdd5-b1352812e1d1" (UID: "5086c326-3298-4b53-bdd5-b1352812e1d1"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:14:27.481871 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.481856 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5086c326-3298-4b53-bdd5-b1352812e1d1-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "5086c326-3298-4b53-bdd5-b1352812e1d1" (UID: "5086c326-3298-4b53-bdd5-b1352812e1d1"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:14:27.481970 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.481956 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5086c326-3298-4b53-bdd5-b1352812e1d1-tokenizer-uds\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:14:27.482027 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.481974 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5086c326-3298-4b53-bdd5-b1352812e1d1-tokenizer-cache\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:14:27.482560 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.482514 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5086c326-3298-4b53-bdd5-b1352812e1d1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5086c326-3298-4b53-bdd5-b1352812e1d1" (UID: "5086c326-3298-4b53-bdd5-b1352812e1d1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:14:27.484133 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.484108 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5086c326-3298-4b53-bdd5-b1352812e1d1-kube-api-access-4llfl" (OuterVolumeSpecName: "kube-api-access-4llfl") pod "5086c326-3298-4b53-bdd5-b1352812e1d1" (UID: "5086c326-3298-4b53-bdd5-b1352812e1d1"). InnerVolumeSpecName "kube-api-access-4llfl". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:14:27.484380 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.484358 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5086c326-3298-4b53-bdd5-b1352812e1d1-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "5086c326-3298-4b53-bdd5-b1352812e1d1" (UID: "5086c326-3298-4b53-bdd5-b1352812e1d1"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:14:27.525518 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.525484 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5g9q5/must-gather-8dcsw"] Apr 22 19:14:27.525789 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.525779 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d81db7c5-136c-426d-95e0-80ccd5b0f807" containerName="tokenizer" Apr 22 19:14:27.525844 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.525791 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d81db7c5-136c-426d-95e0-80ccd5b0f807" containerName="tokenizer" Apr 22 19:14:27.525844 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.525800 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5086c326-3298-4b53-bdd5-b1352812e1d1" containerName="main" Apr 22 19:14:27.525844 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.525806 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5086c326-3298-4b53-bdd5-b1352812e1d1" containerName="main" Apr 22 19:14:27.525844 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.525826 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d81db7c5-136c-426d-95e0-80ccd5b0f807" containerName="main" Apr 22 19:14:27.525844 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.525832 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d81db7c5-136c-426d-95e0-80ccd5b0f807" containerName="main" Apr 22 19:14:27.525844 
ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.525841 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5086c326-3298-4b53-bdd5-b1352812e1d1" containerName="storage-initializer" Apr 22 19:14:27.525844 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.525846 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5086c326-3298-4b53-bdd5-b1352812e1d1" containerName="storage-initializer" Apr 22 19:14:27.526047 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.525856 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d81db7c5-136c-426d-95e0-80ccd5b0f807" containerName="storage-initializer" Apr 22 19:14:27.526047 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.525861 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d81db7c5-136c-426d-95e0-80ccd5b0f807" containerName="storage-initializer" Apr 22 19:14:27.526047 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.525868 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5086c326-3298-4b53-bdd5-b1352812e1d1" containerName="tokenizer" Apr 22 19:14:27.526047 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.525873 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5086c326-3298-4b53-bdd5-b1352812e1d1" containerName="tokenizer" Apr 22 19:14:27.526047 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.525939 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="5086c326-3298-4b53-bdd5-b1352812e1d1" containerName="main" Apr 22 19:14:27.526047 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.525949 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="d81db7c5-136c-426d-95e0-80ccd5b0f807" containerName="main" Apr 22 19:14:27.526047 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.525956 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="d81db7c5-136c-426d-95e0-80ccd5b0f807" containerName="tokenizer" Apr 22 19:14:27.526047 ip-10-0-133-84 
kubenswrapper[2573]: I0422 19:14:27.525962 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="5086c326-3298-4b53-bdd5-b1352812e1d1" containerName="tokenizer" Apr 22 19:14:27.530005 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.529986 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5g9q5/must-gather-8dcsw" Apr 22 19:14:27.532458 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.532427 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5g9q5\"/\"kube-root-ca.crt\"" Apr 22 19:14:27.532549 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.532427 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5g9q5\"/\"openshift-service-ca.crt\"" Apr 22 19:14:27.532549 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.532471 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-5g9q5\"/\"default-dockercfg-dmxcc\"" Apr 22 19:14:27.537682 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.537660 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5g9q5/must-gather-8dcsw"] Apr 22 19:14:27.582841 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.582809 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/83f211ab-4d08-4be1-8bde-e7dd691636bb-must-gather-output\") pod \"must-gather-8dcsw\" (UID: \"83f211ab-4d08-4be1-8bde-e7dd691636bb\") " pod="openshift-must-gather-5g9q5/must-gather-8dcsw" Apr 22 19:14:27.582841 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.582842 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqtwd\" (UniqueName: \"kubernetes.io/projected/83f211ab-4d08-4be1-8bde-e7dd691636bb-kube-api-access-rqtwd\") pod \"must-gather-8dcsw\" 
(UID: \"83f211ab-4d08-4be1-8bde-e7dd691636bb\") " pod="openshift-must-gather-5g9q5/must-gather-8dcsw" Apr 22 19:14:27.583025 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.582878 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5086c326-3298-4b53-bdd5-b1352812e1d1-tls-certs\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:14:27.583025 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.582889 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5086c326-3298-4b53-bdd5-b1352812e1d1-tokenizer-tmp\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:14:27.583025 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.582897 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5086c326-3298-4b53-bdd5-b1352812e1d1-kserve-provision-location\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:14:27.583025 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.582906 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4llfl\" (UniqueName: \"kubernetes.io/projected/5086c326-3298-4b53-bdd5-b1352812e1d1-kube-api-access-4llfl\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\"" Apr 22 19:14:27.683898 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.683867 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/83f211ab-4d08-4be1-8bde-e7dd691636bb-must-gather-output\") pod \"must-gather-8dcsw\" (UID: \"83f211ab-4d08-4be1-8bde-e7dd691636bb\") " pod="openshift-must-gather-5g9q5/must-gather-8dcsw" Apr 22 19:14:27.683898 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.683901 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqtwd\" (UniqueName: 
\"kubernetes.io/projected/83f211ab-4d08-4be1-8bde-e7dd691636bb-kube-api-access-rqtwd\") pod \"must-gather-8dcsw\" (UID: \"83f211ab-4d08-4be1-8bde-e7dd691636bb\") " pod="openshift-must-gather-5g9q5/must-gather-8dcsw" Apr 22 19:14:27.684222 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.684204 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/83f211ab-4d08-4be1-8bde-e7dd691636bb-must-gather-output\") pod \"must-gather-8dcsw\" (UID: \"83f211ab-4d08-4be1-8bde-e7dd691636bb\") " pod="openshift-must-gather-5g9q5/must-gather-8dcsw" Apr 22 19:14:27.691528 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.691497 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqtwd\" (UniqueName: \"kubernetes.io/projected/83f211ab-4d08-4be1-8bde-e7dd691636bb-kube-api-access-rqtwd\") pod \"must-gather-8dcsw\" (UID: \"83f211ab-4d08-4be1-8bde-e7dd691636bb\") " pod="openshift-must-gather-5g9q5/must-gather-8dcsw" Apr 22 19:14:27.839468 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.839391 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5g9q5/must-gather-8dcsw" Apr 22 19:14:27.958228 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.958188 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5g9q5/must-gather-8dcsw"] Apr 22 19:14:27.960409 ip-10-0-133-84 kubenswrapper[2573]: W0422 19:14:27.960386 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83f211ab_4d08_4be1_8bde_e7dd691636bb.slice/crio-8984e37bfd6816e6595e66edd94f1dea8fc547aaacfcbf84ca3536b2dc0fda08 WatchSource:0}: Error finding container 8984e37bfd6816e6595e66edd94f1dea8fc547aaacfcbf84ca3536b2dc0fda08: Status 404 returned error can't find the container with id 8984e37bfd6816e6595e66edd94f1dea8fc547aaacfcbf84ca3536b2dc0fda08 Apr 22 19:14:27.980643 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.980608 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5g9q5/must-gather-8dcsw" event={"ID":"83f211ab-4d08-4be1-8bde-e7dd691636bb","Type":"ContainerStarted","Data":"8984e37bfd6816e6595e66edd94f1dea8fc547aaacfcbf84ca3536b2dc0fda08"} Apr 22 19:14:27.982131 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.982105 2573 generic.go:358] "Generic (PLEG): container finished" podID="5086c326-3298-4b53-bdd5-b1352812e1d1" containerID="d974de68b93390e266b0ab1d5527380177940f65285602d65d8af1fb47ec8b05" exitCode=0 Apr 22 19:14:27.982237 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.982148 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr" event={"ID":"5086c326-3298-4b53-bdd5-b1352812e1d1","Type":"ContainerDied","Data":"d974de68b93390e266b0ab1d5527380177940f65285602d65d8af1fb47ec8b05"} Apr 22 19:14:27.982237 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.982194 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr" event={"ID":"5086c326-3298-4b53-bdd5-b1352812e1d1","Type":"ContainerDied","Data":"cd04f26ab79934008270c0c571c8801713fb030b442f0664ee4bda7ecfc5ee1b"}
Apr 22 19:14:27.982237 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.982208 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr"
Apr 22 19:14:27.982237 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.982215 2573 scope.go:117] "RemoveContainer" containerID="d974de68b93390e266b0ab1d5527380177940f65285602d65d8af1fb47ec8b05"
Apr 22 19:14:27.990307 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.990288 2573 scope.go:117] "RemoveContainer" containerID="73237e8d51648d8b13e2c67e52f02fa1b3e799570e9406521526f79544d19d17"
Apr 22 19:14:27.997747 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:27.997730 2573 scope.go:117] "RemoveContainer" containerID="1e41adb449f98b03ae1a471cffec1d51f6f31cb88ca5f79675a54b6892cd993f"
Apr 22 19:14:28.004985 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:28.004965 2573 scope.go:117] "RemoveContainer" containerID="d974de68b93390e266b0ab1d5527380177940f65285602d65d8af1fb47ec8b05"
Apr 22 19:14:28.005056 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:28.004967 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr"]
Apr 22 19:14:28.005361 ip-10-0-133-84 kubenswrapper[2573]: E0422 19:14:28.005344 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d974de68b93390e266b0ab1d5527380177940f65285602d65d8af1fb47ec8b05\": container with ID starting with d974de68b93390e266b0ab1d5527380177940f65285602d65d8af1fb47ec8b05 not found: ID does not exist" containerID="d974de68b93390e266b0ab1d5527380177940f65285602d65d8af1fb47ec8b05"
Apr 22 19:14:28.005419 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:28.005370 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d974de68b93390e266b0ab1d5527380177940f65285602d65d8af1fb47ec8b05"} err="failed to get container status \"d974de68b93390e266b0ab1d5527380177940f65285602d65d8af1fb47ec8b05\": rpc error: code = NotFound desc = could not find container \"d974de68b93390e266b0ab1d5527380177940f65285602d65d8af1fb47ec8b05\": container with ID starting with d974de68b93390e266b0ab1d5527380177940f65285602d65d8af1fb47ec8b05 not found: ID does not exist"
Apr 22 19:14:28.005419 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:28.005386 2573 scope.go:117] "RemoveContainer" containerID="73237e8d51648d8b13e2c67e52f02fa1b3e799570e9406521526f79544d19d17"
Apr 22 19:14:28.008007 ip-10-0-133-84 kubenswrapper[2573]: E0422 19:14:28.007979 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73237e8d51648d8b13e2c67e52f02fa1b3e799570e9406521526f79544d19d17\": container with ID starting with 73237e8d51648d8b13e2c67e52f02fa1b3e799570e9406521526f79544d19d17 not found: ID does not exist" containerID="73237e8d51648d8b13e2c67e52f02fa1b3e799570e9406521526f79544d19d17"
Apr 22 19:14:28.008092 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:28.008014 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73237e8d51648d8b13e2c67e52f02fa1b3e799570e9406521526f79544d19d17"} err="failed to get container status \"73237e8d51648d8b13e2c67e52f02fa1b3e799570e9406521526f79544d19d17\": rpc error: code = NotFound desc = could not find container \"73237e8d51648d8b13e2c67e52f02fa1b3e799570e9406521526f79544d19d17\": container with ID starting with 73237e8d51648d8b13e2c67e52f02fa1b3e799570e9406521526f79544d19d17 not found: ID does not exist"
Apr 22 19:14:28.008092 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:28.008034 2573 scope.go:117] "RemoveContainer" containerID="1e41adb449f98b03ae1a471cffec1d51f6f31cb88ca5f79675a54b6892cd993f"
Apr 22 19:14:28.008486 ip-10-0-133-84 kubenswrapper[2573]: E0422 19:14:28.008428 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e41adb449f98b03ae1a471cffec1d51f6f31cb88ca5f79675a54b6892cd993f\": container with ID starting with 1e41adb449f98b03ae1a471cffec1d51f6f31cb88ca5f79675a54b6892cd993f not found: ID does not exist" containerID="1e41adb449f98b03ae1a471cffec1d51f6f31cb88ca5f79675a54b6892cd993f"
Apr 22 19:14:28.008589 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:28.008486 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e41adb449f98b03ae1a471cffec1d51f6f31cb88ca5f79675a54b6892cd993f"} err="failed to get container status \"1e41adb449f98b03ae1a471cffec1d51f6f31cb88ca5f79675a54b6892cd993f\": rpc error: code = NotFound desc = could not find container \"1e41adb449f98b03ae1a471cffec1d51f6f31cb88ca5f79675a54b6892cd993f\": container with ID starting with 1e41adb449f98b03ae1a471cffec1d51f6f31cb88ca5f79675a54b6892cd993f not found: ID does not exist"
Apr 22 19:14:28.010648 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:28.010629 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schec8bcr"]
Apr 22 19:14:29.363753 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:29.363720 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5086c326-3298-4b53-bdd5-b1352812e1d1" path="/var/lib/kubelet/pods/5086c326-3298-4b53-bdd5-b1352812e1d1/volumes"
Apr 22 19:14:34.009161 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:34.009120 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5g9q5/must-gather-8dcsw" event={"ID":"83f211ab-4d08-4be1-8bde-e7dd691636bb","Type":"ContainerStarted","Data":"b5eedf2d560a66a513c20ab5d7087e220a8047bf63b6c2589093eb03fef3c7b9"}
Apr 22 19:14:34.009161 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:34.009180 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5g9q5/must-gather-8dcsw" event={"ID":"83f211ab-4d08-4be1-8bde-e7dd691636bb","Type":"ContainerStarted","Data":"71e28141edb644bd6764ef0b0143236bbaf6534fcfd96d6dc70f2f8c328e47cd"}
Apr 22 19:14:34.023357 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:34.023315 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5g9q5/must-gather-8dcsw" podStartSLOduration=1.964657241 podStartE2EDuration="7.023300351s" podCreationTimestamp="2026-04-22 19:14:27 +0000 UTC" firstStartedPulling="2026-04-22 19:14:27.962006831 +0000 UTC m=+1663.202122911" lastFinishedPulling="2026-04-22 19:14:33.020649934 +0000 UTC m=+1668.260766021" observedRunningTime="2026-04-22 19:14:34.022513004 +0000 UTC m=+1669.262629110" watchObservedRunningTime="2026-04-22 19:14:34.023300351 +0000 UTC m=+1669.263416453"
Apr 22 19:14:54.926322 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:54.926288 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-m6qfm_a4eb3066-1ffd-4e80-8482-976812d9d008/discovery/0.log"
Apr 22 19:14:55.713585 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:55.713555 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-m6qfm_a4eb3066-1ffd-4e80-8482-976812d9d008/discovery/0.log"
Apr 22 19:14:56.489364 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:56.489336 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-hz6x8_18c72311-fe1f-4302-81e4-3b4f20fc0097/manager/0.log"
Apr 22 19:14:56.578474 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:56.578442 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-xxvgm_9b1066c7-5168-4d2c-80f3-4994f76a6be0/manager/0.log"
Apr 22 19:14:56.601059 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:56.601037 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-xchc5_c58e17ad-8347-4144-bc44-fdfebd91e686/manager/0.log"
Apr 22 19:14:58.093508 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:58.093477 2573 generic.go:358] "Generic (PLEG): container finished" podID="83f211ab-4d08-4be1-8bde-e7dd691636bb" containerID="71e28141edb644bd6764ef0b0143236bbaf6534fcfd96d6dc70f2f8c328e47cd" exitCode=0
Apr 22 19:14:58.093918 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:58.093549 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5g9q5/must-gather-8dcsw" event={"ID":"83f211ab-4d08-4be1-8bde-e7dd691636bb","Type":"ContainerDied","Data":"71e28141edb644bd6764ef0b0143236bbaf6534fcfd96d6dc70f2f8c328e47cd"}
Apr 22 19:14:58.093918 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:58.093879 2573 scope.go:117] "RemoveContainer" containerID="71e28141edb644bd6764ef0b0143236bbaf6534fcfd96d6dc70f2f8c328e47cd"
Apr 22 19:14:58.207793 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:14:58.207764 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5g9q5_must-gather-8dcsw_83f211ab-4d08-4be1-8bde-e7dd691636bb/gather/0.log"
Apr 22 19:15:01.707464 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:01.707426 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-d4kbs_0c9de063-c082-4665-b7f0-97a86598f6a1/global-pull-secret-syncer/0.log"
Apr 22 19:15:01.791860 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:01.791829 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-tkwkf_6ff01b97-8ec7-4056-9191-72e56fe99653/konnectivity-agent/0.log"
Apr 22 19:15:01.863320 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:01.863295 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-84.ec2.internal_531ef67c8f8283d29f42b90580fc5209/haproxy/0.log"
Apr 22 19:15:03.679574 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:03.679496 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5g9q5/must-gather-8dcsw"]
Apr 22 19:15:03.679956 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:03.679703 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-5g9q5/must-gather-8dcsw" podUID="83f211ab-4d08-4be1-8bde-e7dd691636bb" containerName="copy" containerID="cri-o://b5eedf2d560a66a513c20ab5d7087e220a8047bf63b6c2589093eb03fef3c7b9" gracePeriod=2
Apr 22 19:15:03.682314 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:03.682280 2573 status_manager.go:895] "Failed to get status for pod" podUID="83f211ab-4d08-4be1-8bde-e7dd691636bb" pod="openshift-must-gather-5g9q5/must-gather-8dcsw" err="pods \"must-gather-8dcsw\" is forbidden: User \"system:node:ip-10-0-133-84.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-5g9q5\": no relationship found between node 'ip-10-0-133-84.ec2.internal' and this object"
Apr 22 19:15:03.682867 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:03.682846 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5g9q5/must-gather-8dcsw"]
Apr 22 19:15:03.910937 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:03.910912 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5g9q5_must-gather-8dcsw_83f211ab-4d08-4be1-8bde-e7dd691636bb/copy/0.log"
Apr 22 19:15:03.911289 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:03.911274 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5g9q5/must-gather-8dcsw"
Apr 22 19:15:03.913289 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:03.913266 2573 status_manager.go:895] "Failed to get status for pod" podUID="83f211ab-4d08-4be1-8bde-e7dd691636bb" pod="openshift-must-gather-5g9q5/must-gather-8dcsw" err="pods \"must-gather-8dcsw\" is forbidden: User \"system:node:ip-10-0-133-84.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-5g9q5\": no relationship found between node 'ip-10-0-133-84.ec2.internal' and this object"
Apr 22 19:15:04.100262 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:04.100160 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/83f211ab-4d08-4be1-8bde-e7dd691636bb-must-gather-output\") pod \"83f211ab-4d08-4be1-8bde-e7dd691636bb\" (UID: \"83f211ab-4d08-4be1-8bde-e7dd691636bb\") "
Apr 22 19:15:04.100262 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:04.100263 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqtwd\" (UniqueName: \"kubernetes.io/projected/83f211ab-4d08-4be1-8bde-e7dd691636bb-kube-api-access-rqtwd\") pod \"83f211ab-4d08-4be1-8bde-e7dd691636bb\" (UID: \"83f211ab-4d08-4be1-8bde-e7dd691636bb\") "
Apr 22 19:15:04.102646 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:04.102611 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83f211ab-4d08-4be1-8bde-e7dd691636bb-kube-api-access-rqtwd" (OuterVolumeSpecName: "kube-api-access-rqtwd") pod "83f211ab-4d08-4be1-8bde-e7dd691636bb" (UID: "83f211ab-4d08-4be1-8bde-e7dd691636bb"). InnerVolumeSpecName "kube-api-access-rqtwd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:15:04.103667 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:04.103644 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83f211ab-4d08-4be1-8bde-e7dd691636bb-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "83f211ab-4d08-4be1-8bde-e7dd691636bb" (UID: "83f211ab-4d08-4be1-8bde-e7dd691636bb"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:15:04.115004 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:04.114978 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5g9q5_must-gather-8dcsw_83f211ab-4d08-4be1-8bde-e7dd691636bb/copy/0.log"
Apr 22 19:15:04.115338 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:04.115314 2573 generic.go:358] "Generic (PLEG): container finished" podID="83f211ab-4d08-4be1-8bde-e7dd691636bb" containerID="b5eedf2d560a66a513c20ab5d7087e220a8047bf63b6c2589093eb03fef3c7b9" exitCode=143
Apr 22 19:15:04.115395 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:04.115368 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5g9q5/must-gather-8dcsw"
Apr 22 19:15:04.115442 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:04.115427 2573 scope.go:117] "RemoveContainer" containerID="b5eedf2d560a66a513c20ab5d7087e220a8047bf63b6c2589093eb03fef3c7b9"
Apr 22 19:15:04.117518 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:04.117494 2573 status_manager.go:895] "Failed to get status for pod" podUID="83f211ab-4d08-4be1-8bde-e7dd691636bb" pod="openshift-must-gather-5g9q5/must-gather-8dcsw" err="pods \"must-gather-8dcsw\" is forbidden: User \"system:node:ip-10-0-133-84.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-5g9q5\": no relationship found between node 'ip-10-0-133-84.ec2.internal' and this object"
Apr 22 19:15:04.123010 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:04.122987 2573 scope.go:117] "RemoveContainer" containerID="71e28141edb644bd6764ef0b0143236bbaf6534fcfd96d6dc70f2f8c328e47cd"
Apr 22 19:15:04.125743 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:04.125721 2573 status_manager.go:895] "Failed to get status for pod" podUID="83f211ab-4d08-4be1-8bde-e7dd691636bb" pod="openshift-must-gather-5g9q5/must-gather-8dcsw" err="pods \"must-gather-8dcsw\" is forbidden: User \"system:node:ip-10-0-133-84.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-5g9q5\": no relationship found between node 'ip-10-0-133-84.ec2.internal' and this object"
Apr 22 19:15:04.135646 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:04.135626 2573 scope.go:117] "RemoveContainer" containerID="b5eedf2d560a66a513c20ab5d7087e220a8047bf63b6c2589093eb03fef3c7b9"
Apr 22 19:15:04.135912 ip-10-0-133-84 kubenswrapper[2573]: E0422 19:15:04.135893 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5eedf2d560a66a513c20ab5d7087e220a8047bf63b6c2589093eb03fef3c7b9\": container with ID starting with b5eedf2d560a66a513c20ab5d7087e220a8047bf63b6c2589093eb03fef3c7b9 not found: ID does not exist" containerID="b5eedf2d560a66a513c20ab5d7087e220a8047bf63b6c2589093eb03fef3c7b9"
Apr 22 19:15:04.135981 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:04.135923 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5eedf2d560a66a513c20ab5d7087e220a8047bf63b6c2589093eb03fef3c7b9"} err="failed to get container status \"b5eedf2d560a66a513c20ab5d7087e220a8047bf63b6c2589093eb03fef3c7b9\": rpc error: code = NotFound desc = could not find container \"b5eedf2d560a66a513c20ab5d7087e220a8047bf63b6c2589093eb03fef3c7b9\": container with ID starting with b5eedf2d560a66a513c20ab5d7087e220a8047bf63b6c2589093eb03fef3c7b9 not found: ID does not exist"
Apr 22 19:15:04.135981 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:04.135950 2573 scope.go:117] "RemoveContainer" containerID="71e28141edb644bd6764ef0b0143236bbaf6534fcfd96d6dc70f2f8c328e47cd"
Apr 22 19:15:04.136194 ip-10-0-133-84 kubenswrapper[2573]: E0422 19:15:04.136159 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71e28141edb644bd6764ef0b0143236bbaf6534fcfd96d6dc70f2f8c328e47cd\": container with ID starting with 71e28141edb644bd6764ef0b0143236bbaf6534fcfd96d6dc70f2f8c328e47cd not found: ID does not exist" containerID="71e28141edb644bd6764ef0b0143236bbaf6534fcfd96d6dc70f2f8c328e47cd"
Apr 22 19:15:04.136243 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:04.136202 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71e28141edb644bd6764ef0b0143236bbaf6534fcfd96d6dc70f2f8c328e47cd"} err="failed to get container status \"71e28141edb644bd6764ef0b0143236bbaf6534fcfd96d6dc70f2f8c328e47cd\": rpc error: code = NotFound desc = could not find container \"71e28141edb644bd6764ef0b0143236bbaf6534fcfd96d6dc70f2f8c328e47cd\": container with ID starting with 71e28141edb644bd6764ef0b0143236bbaf6534fcfd96d6dc70f2f8c328e47cd not found: ID does not exist"
Apr 22 19:15:04.201718 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:04.201692 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rqtwd\" (UniqueName: \"kubernetes.io/projected/83f211ab-4d08-4be1-8bde-e7dd691636bb-kube-api-access-rqtwd\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\""
Apr 22 19:15:04.201718 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:04.201715 2573 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/83f211ab-4d08-4be1-8bde-e7dd691636bb-must-gather-output\") on node \"ip-10-0-133-84.ec2.internal\" DevicePath \"\""
Apr 22 19:15:05.363422 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:05.363388 2573 status_manager.go:895] "Failed to get status for pod" podUID="83f211ab-4d08-4be1-8bde-e7dd691636bb" pod="openshift-must-gather-5g9q5/must-gather-8dcsw" err="pods \"must-gather-8dcsw\" is forbidden: User \"system:node:ip-10-0-133-84.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-5g9q5\": no relationship found between node 'ip-10-0-133-84.ec2.internal' and this object"
Apr 22 19:15:05.364587 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:05.364567 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83f211ab-4d08-4be1-8bde-e7dd691636bb" path="/var/lib/kubelet/pods/83f211ab-4d08-4be1-8bde-e7dd691636bb/volumes"
Apr 22 19:15:06.103620 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:06.103587 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-hz6x8_18c72311-fe1f-4302-81e4-3b4f20fc0097/manager/0.log"
Apr 22 19:15:06.194283 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:06.194257 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-xxvgm_9b1066c7-5168-4d2c-80f3-4994f76a6be0/manager/0.log"
Apr 22 19:15:06.245456 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:06.245423 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-xchc5_c58e17ad-8347-4144-bc44-fdfebd91e686/manager/0.log"
Apr 22 19:15:07.366847 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:07.366816 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-jqtr9_1ec1a089-f815-406b-a0ff-8ee4d88530dc/kube-state-metrics/0.log"
Apr 22 19:15:07.385369 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:07.385343 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-jqtr9_1ec1a089-f815-406b-a0ff-8ee4d88530dc/kube-rbac-proxy-main/0.log"
Apr 22 19:15:07.408447 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:07.408428 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-jqtr9_1ec1a089-f815-406b-a0ff-8ee4d88530dc/kube-rbac-proxy-self/0.log"
Apr 22 19:15:07.581108 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:07.581083 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-kjj54_68fb2641-d900-4570-9106-dfa68f2a21a2/node-exporter/0.log"
Apr 22 19:15:07.609946 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:07.609925 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-kjj54_68fb2641-d900-4570-9106-dfa68f2a21a2/kube-rbac-proxy/0.log"
Apr 22 19:15:07.628626 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:07.628605 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-kjj54_68fb2641-d900-4570-9106-dfa68f2a21a2/init-textfile/0.log"
Apr 22 19:15:08.092835 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:08.092803 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-tftld_4916102b-7785-4334-9fcc-bd481df20b31/prometheus-operator-admission-webhook/0.log"
Apr 22 19:15:08.187470 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:08.187401 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-8fbdfc678-rs8ng_df0c630f-3b5c-4c80-a392-4f1cbddb040e/thanos-query/0.log"
Apr 22 19:15:08.205879 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:08.205859 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-8fbdfc678-rs8ng_df0c630f-3b5c-4c80-a392-4f1cbddb040e/kube-rbac-proxy-web/0.log"
Apr 22 19:15:08.226437 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:08.226419 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-8fbdfc678-rs8ng_df0c630f-3b5c-4c80-a392-4f1cbddb040e/kube-rbac-proxy/0.log"
Apr 22 19:15:08.248074 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:08.248054 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-8fbdfc678-rs8ng_df0c630f-3b5c-4c80-a392-4f1cbddb040e/prom-label-proxy/0.log"
Apr 22 19:15:08.268425 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:08.268406 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-8fbdfc678-rs8ng_df0c630f-3b5c-4c80-a392-4f1cbddb040e/kube-rbac-proxy-rules/0.log"
Apr 22 19:15:08.289880 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:08.289862 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-8fbdfc678-rs8ng_df0c630f-3b5c-4c80-a392-4f1cbddb040e/kube-rbac-proxy-metrics/0.log"
Apr 22 19:15:10.635211 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:10.635154 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-sn5zn/perf-node-gather-daemonset-8cmzk"]
Apr 22 19:15:10.635653 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:10.635611 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="83f211ab-4d08-4be1-8bde-e7dd691636bb" containerName="gather"
Apr 22 19:15:10.635653 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:10.635629 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="83f211ab-4d08-4be1-8bde-e7dd691636bb" containerName="gather"
Apr 22 19:15:10.635765 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:10.635659 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="83f211ab-4d08-4be1-8bde-e7dd691636bb" containerName="copy"
Apr 22 19:15:10.635765 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:10.635668 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="83f211ab-4d08-4be1-8bde-e7dd691636bb" containerName="copy"
Apr 22 19:15:10.635765 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:10.635747 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="83f211ab-4d08-4be1-8bde-e7dd691636bb" containerName="copy"
Apr 22 19:15:10.635765 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:10.635764 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="83f211ab-4d08-4be1-8bde-e7dd691636bb" containerName="gather"
Apr 22 19:15:10.643597 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:10.643385 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-8cmzk"
Apr 22 19:15:10.645425 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:10.645403 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-sn5zn/perf-node-gather-daemonset-8cmzk"]
Apr 22 19:15:10.646024 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:10.646000 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-sn5zn\"/\"openshift-service-ca.crt\""
Apr 22 19:15:10.646130 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:10.646008 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-sn5zn\"/\"default-dockercfg-dc758\""
Apr 22 19:15:10.647209 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:10.647187 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-sn5zn\"/\"kube-root-ca.crt\""
Apr 22 19:15:10.741799 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:10.741769 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk9r8\" (UniqueName: \"kubernetes.io/projected/d8ebe5c1-8b2c-4d89-93a3-7cea8183f8a8-kube-api-access-gk9r8\") pod \"perf-node-gather-daemonset-8cmzk\" (UID: \"d8ebe5c1-8b2c-4d89-93a3-7cea8183f8a8\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-8cmzk"
Apr 22 19:15:10.741799 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:10.741804 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d8ebe5c1-8b2c-4d89-93a3-7cea8183f8a8-podres\") pod \"perf-node-gather-daemonset-8cmzk\" (UID: \"d8ebe5c1-8b2c-4d89-93a3-7cea8183f8a8\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-8cmzk"
Apr 22 19:15:10.741995 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:10.741839 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d8ebe5c1-8b2c-4d89-93a3-7cea8183f8a8-sys\") pod \"perf-node-gather-daemonset-8cmzk\" (UID: \"d8ebe5c1-8b2c-4d89-93a3-7cea8183f8a8\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-8cmzk"
Apr 22 19:15:10.741995 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:10.741872 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d8ebe5c1-8b2c-4d89-93a3-7cea8183f8a8-proc\") pod \"perf-node-gather-daemonset-8cmzk\" (UID: \"d8ebe5c1-8b2c-4d89-93a3-7cea8183f8a8\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-8cmzk"
Apr 22 19:15:10.741995 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:10.741920 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d8ebe5c1-8b2c-4d89-93a3-7cea8183f8a8-lib-modules\") pod \"perf-node-gather-daemonset-8cmzk\" (UID: \"d8ebe5c1-8b2c-4d89-93a3-7cea8183f8a8\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-8cmzk"
Apr 22 19:15:10.843261 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:10.843235 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gk9r8\" (UniqueName: \"kubernetes.io/projected/d8ebe5c1-8b2c-4d89-93a3-7cea8183f8a8-kube-api-access-gk9r8\") pod \"perf-node-gather-daemonset-8cmzk\" (UID: \"d8ebe5c1-8b2c-4d89-93a3-7cea8183f8a8\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-8cmzk"
Apr 22 19:15:10.843433 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:10.843270 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d8ebe5c1-8b2c-4d89-93a3-7cea8183f8a8-podres\") pod \"perf-node-gather-daemonset-8cmzk\" (UID: \"d8ebe5c1-8b2c-4d89-93a3-7cea8183f8a8\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-8cmzk"
Apr 22 19:15:10.843433 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:10.843293 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d8ebe5c1-8b2c-4d89-93a3-7cea8183f8a8-sys\") pod \"perf-node-gather-daemonset-8cmzk\" (UID: \"d8ebe5c1-8b2c-4d89-93a3-7cea8183f8a8\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-8cmzk"
Apr 22 19:15:10.843433 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:10.843310 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d8ebe5c1-8b2c-4d89-93a3-7cea8183f8a8-proc\") pod \"perf-node-gather-daemonset-8cmzk\" (UID: \"d8ebe5c1-8b2c-4d89-93a3-7cea8183f8a8\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-8cmzk"
Apr 22 19:15:10.843433 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:10.843381 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d8ebe5c1-8b2c-4d89-93a3-7cea8183f8a8-lib-modules\") pod \"perf-node-gather-daemonset-8cmzk\" (UID: \"d8ebe5c1-8b2c-4d89-93a3-7cea8183f8a8\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-8cmzk"
Apr 22 19:15:10.843433 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:10.843387 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d8ebe5c1-8b2c-4d89-93a3-7cea8183f8a8-sys\") pod \"perf-node-gather-daemonset-8cmzk\" (UID: \"d8ebe5c1-8b2c-4d89-93a3-7cea8183f8a8\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-8cmzk"
Apr 22 19:15:10.843433 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:10.843406 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d8ebe5c1-8b2c-4d89-93a3-7cea8183f8a8-podres\") pod \"perf-node-gather-daemonset-8cmzk\" (UID: \"d8ebe5c1-8b2c-4d89-93a3-7cea8183f8a8\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-8cmzk"
Apr 22 19:15:10.843803 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:10.843502 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d8ebe5c1-8b2c-4d89-93a3-7cea8183f8a8-proc\") pod \"perf-node-gather-daemonset-8cmzk\" (UID: \"d8ebe5c1-8b2c-4d89-93a3-7cea8183f8a8\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-8cmzk"
Apr 22 19:15:10.843803 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:10.843532 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d8ebe5c1-8b2c-4d89-93a3-7cea8183f8a8-lib-modules\") pod \"perf-node-gather-daemonset-8cmzk\" (UID: \"d8ebe5c1-8b2c-4d89-93a3-7cea8183f8a8\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-8cmzk"
Apr 22 19:15:10.850847 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:10.850830 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk9r8\" (UniqueName: \"kubernetes.io/projected/d8ebe5c1-8b2c-4d89-93a3-7cea8183f8a8-kube-api-access-gk9r8\") pod \"perf-node-gather-daemonset-8cmzk\" (UID: \"d8ebe5c1-8b2c-4d89-93a3-7cea8183f8a8\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-8cmzk"
Apr 22 19:15:10.954452 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:10.954389 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-8cmzk"
Apr 22 19:15:11.073614 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:11.073575 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-sn5zn/perf-node-gather-daemonset-8cmzk"]
Apr 22 19:15:11.076646 ip-10-0-133-84 kubenswrapper[2573]: W0422 19:15:11.076616 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd8ebe5c1_8b2c_4d89_93a3_7cea8183f8a8.slice/crio-c3b999a624266fe68fb01ab6cc19c0e0e691daf358d7e68f7214321302487e71 WatchSource:0}: Error finding container c3b999a624266fe68fb01ab6cc19c0e0e691daf358d7e68f7214321302487e71: Status 404 returned error can't find the container with id c3b999a624266fe68fb01ab6cc19c0e0e691daf358d7e68f7214321302487e71
Apr 22 19:15:11.139230 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:11.139201 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-8cmzk" event={"ID":"d8ebe5c1-8b2c-4d89-93a3-7cea8183f8a8","Type":"ContainerStarted","Data":"c3b999a624266fe68fb01ab6cc19c0e0e691daf358d7e68f7214321302487e71"}
Apr 22 19:15:11.565710 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:11.565677 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-q62qx_39993c75-c30d-49df-8bc3-5e7250217350/dns/0.log"
Apr 22 19:15:11.584187 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:11.584141 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-q62qx_39993c75-c30d-49df-8bc3-5e7250217350/kube-rbac-proxy/0.log"
Apr 22 19:15:11.686537 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:11.686505 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-sklxm_c935a1fa-6e87-4c17-b437-53842e8bf6b5/dns-node-resolver/0.log"
Apr 22 19:15:12.120349 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:12.120318 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fsm87_613626ea-ce6e-4907-a058-fed79d83cc79/node-ca/0.log"
Apr 22 19:15:12.143528 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:12.143498 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-8cmzk" event={"ID":"d8ebe5c1-8b2c-4d89-93a3-7cea8183f8a8","Type":"ContainerStarted","Data":"b3fc7b811974430420fd574cf51ea31454c73f9d5dd4c9475ab845d8a9372744"}
Apr 22 19:15:12.143690 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:12.143677 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-8cmzk"
Apr 22 19:15:12.910547 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:12.910503 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-m6qfm_a4eb3066-1ffd-4e80-8482-976812d9d008/discovery/0.log"
Apr 22 19:15:13.390655 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:13.390618 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-sd2xm_2a601b18-e341-409a-8702-534bf559976c/serve-healthcheck-canary/0.log"
Apr 22 19:15:13.818536 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:13.818442 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-28bz4_4e9618b6-810d-461a-ac5f-ce2cacfcc950/kube-rbac-proxy/0.log"
Apr 22 19:15:13.837255 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:13.837232 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-28bz4_4e9618b6-810d-461a-ac5f-ce2cacfcc950/exporter/0.log"
Apr 22 19:15:13.857352 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:13.857330 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-28bz4_4e9618b6-810d-461a-ac5f-ce2cacfcc950/extractor/0.log"
Apr 22 19:15:16.517308 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:16.517281 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-bhbch_43ebccbc-000c-4ebe-97b0-7c1fc69bebd5/openshift-lws-operator/0.log"
Apr 22 19:15:17.063662 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:17.063635 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-wtfk9_4858a3cd-5f36-4406-9da0-7fe15c84d8b5/server/0.log"
Apr 22 19:15:18.156742 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:18.156716 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-8cmzk"
Apr 22 19:15:18.173380 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:18.173325 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-8cmzk" podStartSLOduration=8.173311505 podStartE2EDuration="8.173311505s" podCreationTimestamp="2026-04-22 19:15:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:15:12.15772461 +0000 UTC m=+1707.397840712" watchObservedRunningTime="2026-04-22 19:15:18.173311505 +0000 UTC m=+1713.413427606"
Apr 22 19:15:22.938016 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:22.937985 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9qjxs_d278e3ab-ea06-4b10-bfb7-327499648b8a/kube-multus-additional-cni-plugins/0.log"
Apr 22 19:15:22.956886 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:22.956860 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9qjxs_d278e3ab-ea06-4b10-bfb7-327499648b8a/egress-router-binary-copy/0.log"
Apr 22 19:15:22.981969 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:22.981949 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9qjxs_d278e3ab-ea06-4b10-bfb7-327499648b8a/cni-plugins/0.log"
Apr 22 19:15:23.000867 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:23.000842 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9qjxs_d278e3ab-ea06-4b10-bfb7-327499648b8a/bond-cni-plugin/0.log"
Apr 22 19:15:23.020841 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:23.020816 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9qjxs_d278e3ab-ea06-4b10-bfb7-327499648b8a/routeoverride-cni/0.log"
Apr 22 19:15:23.042561 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:23.042538 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9qjxs_d278e3ab-ea06-4b10-bfb7-327499648b8a/whereabouts-cni-bincopy/0.log"
Apr 22 19:15:23.061426 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:23.061404 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9qjxs_d278e3ab-ea06-4b10-bfb7-327499648b8a/whereabouts-cni/0.log"
Apr 22 19:15:23.419676 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:23.419648 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rdcjp_6583213e-9d3a-4b3f-b477-300fa1ff26c2/kube-multus/0.log"
Apr 22 19:15:23.560687 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:23.560653 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-mjbsn_c87517bd-8a13-4cb2-bf88-0b3d8c58b67c/network-metrics-daemon/0.log"
Apr 22 19:15:23.579945 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:23.579914 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-mjbsn_c87517bd-8a13-4cb2-bf88-0b3d8c58b67c/kube-rbac-proxy/0.log"
Apr 22 19:15:24.379578 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:24.379523 2573
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cr6wp_bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278/ovn-controller/0.log" Apr 22 19:15:24.407308 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:24.407273 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cr6wp_bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278/ovn-acl-logging/0.log" Apr 22 19:15:24.413993 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:24.413974 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cr6wp_bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278/ovn-acl-logging/1.log" Apr 22 19:15:24.445541 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:24.445521 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cr6wp_bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278/kube-rbac-proxy-node/0.log" Apr 22 19:15:24.481877 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:24.481854 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cr6wp_bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 19:15:24.513002 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:24.512983 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cr6wp_bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278/northd/0.log" Apr 22 19:15:24.551020 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:24.551001 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cr6wp_bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278/nbdb/0.log" Apr 22 19:15:24.585625 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:24.585609 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cr6wp_bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278/sbdb/0.log" Apr 22 19:15:24.696523 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:24.696457 2573 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cr6wp_bbfc7d3e-a6f6-4faa-90ed-6028f3ed1278/ovnkube-controller/0.log" Apr 22 19:15:26.343117 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:26.343092 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-cxglm_54689db5-4b53-4548-b0fc-1a5da6d4dbcc/network-check-target-container/0.log" Apr 22 19:15:27.330275 ip-10-0-133-84 kubenswrapper[2573]: I0422 19:15:27.330245 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-wp4tk_67bcb723-894e-4da7-a40d-b8def8103a95/iptables-alerter/0.log"